Dec 08 19:17:44 localhost kernel: Linux version 5.14.0-645.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Fri Nov 28 14:01:17 UTC 2025
Dec 08 19:17:44 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 08 19:17:44 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 08 19:17:44 localhost kernel: BIOS-provided physical RAM map:
Dec 08 19:17:44 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 08 19:17:44 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 08 19:17:44 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 08 19:17:44 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 08 19:17:44 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 08 19:17:44 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 08 19:17:44 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 08 19:17:44 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 08 19:17:44 localhost kernel: NX (Execute Disable) protection: active
Dec 08 19:17:44 localhost kernel: APIC: Static calls initialized
Dec 08 19:17:44 localhost kernel: SMBIOS 2.8 present.
Dec 08 19:17:44 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 08 19:17:44 localhost kernel: Hypervisor detected: KVM
Dec 08 19:17:44 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 08 19:17:44 localhost kernel: kvm-clock: using sched offset of 3129374225 cycles
Dec 08 19:17:44 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 08 19:17:44 localhost kernel: tsc: Detected 2800.000 MHz processor
Dec 08 19:17:44 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Dec 08 19:17:44 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Dec 08 19:17:44 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 08 19:17:44 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 08 19:17:44 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 08 19:17:44 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 08 19:17:44 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 08 19:17:44 localhost kernel: Using GB pages for direct mapping
Dec 08 19:17:44 localhost kernel: RAMDISK: [mem 0x2d472000-0x32a30fff]
Dec 08 19:17:44 localhost kernel: ACPI: Early table checksum verification disabled
Dec 08 19:17:44 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 08 19:17:44 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 08 19:17:44 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 08 19:17:44 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 08 19:17:44 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 08 19:17:44 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 08 19:17:44 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 08 19:17:44 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 08 19:17:44 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 08 19:17:44 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 08 19:17:44 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 08 19:17:44 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 08 19:17:44 localhost kernel: No NUMA configuration found
Dec 08 19:17:44 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 08 19:17:44 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec 08 19:17:44 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 08 19:17:44 localhost kernel: Zone ranges:
Dec 08 19:17:44 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 08 19:17:44 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 08 19:17:44 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 08 19:17:44 localhost kernel:   Device   empty
Dec 08 19:17:44 localhost kernel: Movable zone start for each node
Dec 08 19:17:44 localhost kernel: Early memory node ranges
Dec 08 19:17:44 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 08 19:17:44 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 08 19:17:44 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 08 19:17:44 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 08 19:17:44 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 08 19:17:44 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 08 19:17:44 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 08 19:17:44 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Dec 08 19:17:44 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 08 19:17:44 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 08 19:17:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 08 19:17:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 08 19:17:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 08 19:17:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 08 19:17:44 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 08 19:17:44 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 08 19:17:44 localhost kernel: TSC deadline timer available
Dec 08 19:17:44 localhost kernel: CPU topo: Max. logical packages:   8
Dec 08 19:17:44 localhost kernel: CPU topo: Max. logical dies:       8
Dec 08 19:17:44 localhost kernel: CPU topo: Max. dies per package:   1
Dec 08 19:17:44 localhost kernel: CPU topo: Max. threads per core:   1
Dec 08 19:17:44 localhost kernel: CPU topo: Num. cores per package:     1
Dec 08 19:17:44 localhost kernel: CPU topo: Num. threads per package:   1
Dec 08 19:17:44 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 08 19:17:44 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 08 19:17:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 08 19:17:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 08 19:17:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 08 19:17:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 08 19:17:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 08 19:17:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 08 19:17:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 08 19:17:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 08 19:17:44 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 08 19:17:44 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 08 19:17:44 localhost kernel: Booting paravirtualized kernel on KVM
Dec 08 19:17:44 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 08 19:17:44 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 08 19:17:44 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 08 19:17:44 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Dec 08 19:17:44 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Dec 08 19:17:44 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 08 19:17:44 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 08 19:17:44 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64", will be passed to user space.
Dec 08 19:17:44 localhost kernel: random: crng init done
Dec 08 19:17:44 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 08 19:17:44 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 08 19:17:44 localhost kernel: Fallback order for Node 0: 0 
Dec 08 19:17:44 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 08 19:17:44 localhost kernel: Policy zone: Normal
Dec 08 19:17:44 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 08 19:17:44 localhost kernel: software IO TLB: area num 8.
Dec 08 19:17:44 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 08 19:17:44 localhost kernel: ftrace: allocating 49335 entries in 193 pages
Dec 08 19:17:44 localhost kernel: ftrace: allocated 193 pages with 3 groups
Dec 08 19:17:44 localhost kernel: Dynamic Preempt: voluntary
Dec 08 19:17:44 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 08 19:17:44 localhost kernel: rcu:         RCU event tracing is enabled.
Dec 08 19:17:44 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 08 19:17:44 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Dec 08 19:17:44 localhost kernel:         Rude variant of Tasks RCU enabled.
Dec 08 19:17:44 localhost kernel:         Tracing variant of Tasks RCU enabled.
Dec 08 19:17:44 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 08 19:17:44 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 08 19:17:44 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 08 19:17:44 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 08 19:17:44 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 08 19:17:44 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 08 19:17:44 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 08 19:17:44 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 08 19:17:44 localhost kernel: Console: colour VGA+ 80x25
Dec 08 19:17:44 localhost kernel: printk: console [ttyS0] enabled
Dec 08 19:17:44 localhost kernel: ACPI: Core revision 20230331
Dec 08 19:17:44 localhost kernel: APIC: Switch to symmetric I/O mode setup
Dec 08 19:17:44 localhost kernel: x2apic enabled
Dec 08 19:17:44 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Dec 08 19:17:44 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 08 19:17:44 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Dec 08 19:17:44 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 08 19:17:44 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 08 19:17:44 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 08 19:17:44 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 08 19:17:44 localhost kernel: Spectre V2 : Mitigation: Retpolines
Dec 08 19:17:44 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 08 19:17:44 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 08 19:17:44 localhost kernel: RETBleed: Mitigation: untrained return thunk
Dec 08 19:17:44 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 08 19:17:44 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 08 19:17:44 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 08 19:17:44 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 08 19:17:44 localhost kernel: x86/bugs: return thunk changed
Dec 08 19:17:44 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 08 19:17:44 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 08 19:17:44 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 08 19:17:44 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 08 19:17:44 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 08 19:17:44 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 08 19:17:44 localhost kernel: Freeing SMP alternatives memory: 40K
Dec 08 19:17:44 localhost kernel: pid_max: default: 32768 minimum: 301
Dec 08 19:17:44 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 08 19:17:44 localhost kernel: landlock: Up and running.
Dec 08 19:17:44 localhost kernel: Yama: becoming mindful.
Dec 08 19:17:44 localhost kernel: SELinux:  Initializing.
Dec 08 19:17:44 localhost kernel: LSM support for eBPF active
Dec 08 19:17:44 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 08 19:17:44 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 08 19:17:44 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 08 19:17:44 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 08 19:17:44 localhost kernel: ... version:                0
Dec 08 19:17:44 localhost kernel: ... bit width:              48
Dec 08 19:17:44 localhost kernel: ... generic registers:      6
Dec 08 19:17:44 localhost kernel: ... value mask:             0000ffffffffffff
Dec 08 19:17:44 localhost kernel: ... max period:             00007fffffffffff
Dec 08 19:17:44 localhost kernel: ... fixed-purpose events:   0
Dec 08 19:17:44 localhost kernel: ... event mask:             000000000000003f
Dec 08 19:17:44 localhost kernel: signal: max sigframe size: 1776
Dec 08 19:17:44 localhost kernel: rcu: Hierarchical SRCU implementation.
Dec 08 19:17:44 localhost kernel: rcu:         Max phase no-delay instances is 400.
Dec 08 19:17:44 localhost kernel: smp: Bringing up secondary CPUs ...
Dec 08 19:17:44 localhost kernel: smpboot: x86: Booting SMP configuration:
Dec 08 19:17:44 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 08 19:17:44 localhost kernel: smp: Brought up 1 node, 8 CPUs
Dec 08 19:17:44 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Dec 08 19:17:44 localhost kernel: node 0 deferred pages initialised in 11ms
Dec 08 19:17:44 localhost kernel: Memory: 7763876K/8388068K available (16384K kernel code, 5795K rwdata, 13908K rodata, 4196K init, 7156K bss, 618204K reserved, 0K cma-reserved)
Dec 08 19:17:44 localhost kernel: devtmpfs: initialized
Dec 08 19:17:44 localhost kernel: x86/mm: Memory block size: 128MB
Dec 08 19:17:44 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 08 19:17:44 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 08 19:17:44 localhost kernel: pinctrl core: initialized pinctrl subsystem
Dec 08 19:17:44 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 08 19:17:44 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 08 19:17:44 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 08 19:17:44 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 08 19:17:44 localhost kernel: audit: initializing netlink subsys (disabled)
Dec 08 19:17:44 localhost kernel: audit: type=2000 audit(1765221462.268:1): state=initialized audit_enabled=0 res=1
Dec 08 19:17:44 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 08 19:17:44 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 08 19:17:44 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 08 19:17:44 localhost kernel: cpuidle: using governor menu
Dec 08 19:17:44 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 08 19:17:44 localhost kernel: PCI: Using configuration type 1 for base access
Dec 08 19:17:44 localhost kernel: PCI: Using configuration type 1 for extended access
Dec 08 19:17:44 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 08 19:17:44 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 08 19:17:44 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 08 19:17:44 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 08 19:17:44 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 08 19:17:44 localhost kernel: Demotion targets for Node 0: null
Dec 08 19:17:44 localhost kernel: cryptd: max_cpu_qlen set to 1000
Dec 08 19:17:44 localhost kernel: ACPI: Added _OSI(Module Device)
Dec 08 19:17:44 localhost kernel: ACPI: Added _OSI(Processor Device)
Dec 08 19:17:44 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 08 19:17:44 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 08 19:17:44 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 08 19:17:44 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 08 19:17:44 localhost kernel: ACPI: Interpreter enabled
Dec 08 19:17:44 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 08 19:17:44 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Dec 08 19:17:44 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 08 19:17:44 localhost kernel: PCI: Using E820 reservations for host bridge windows
Dec 08 19:17:44 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 08 19:17:44 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 08 19:17:44 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [3] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [4] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [5] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [6] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [7] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [8] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [9] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [10] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [11] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [12] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [13] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [14] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [15] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [16] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [17] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [18] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [19] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [20] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [21] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [22] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [23] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [24] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [25] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [26] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [27] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [28] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [29] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [30] registered
Dec 08 19:17:44 localhost kernel: acpiphp: Slot [31] registered
Dec 08 19:17:44 localhost kernel: PCI host bridge to bus 0000:00
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 08 19:17:44 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 08 19:17:44 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 08 19:17:44 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 08 19:17:44 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 08 19:17:44 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 08 19:17:44 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 08 19:17:44 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 08 19:17:44 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 08 19:17:44 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 08 19:17:44 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 08 19:17:44 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 08 19:17:44 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 08 19:17:44 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 08 19:17:44 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 08 19:17:44 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 08 19:17:44 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 08 19:17:44 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 08 19:17:44 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 08 19:17:44 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 08 19:17:44 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 08 19:17:44 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 08 19:17:44 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 08 19:17:44 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 08 19:17:44 localhost kernel: iommu: Default domain type: Translated
Dec 08 19:17:44 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 08 19:17:44 localhost kernel: SCSI subsystem initialized
Dec 08 19:17:44 localhost kernel: ACPI: bus type USB registered
Dec 08 19:17:44 localhost kernel: usbcore: registered new interface driver usbfs
Dec 08 19:17:44 localhost kernel: usbcore: registered new interface driver hub
Dec 08 19:17:44 localhost kernel: usbcore: registered new device driver usb
Dec 08 19:17:44 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 08 19:17:44 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 08 19:17:44 localhost kernel: PTP clock support registered
Dec 08 19:17:44 localhost kernel: EDAC MC: Ver: 3.0.0
Dec 08 19:17:44 localhost kernel: NetLabel: Initializing
Dec 08 19:17:44 localhost kernel: NetLabel:  domain hash size = 128
Dec 08 19:17:44 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 08 19:17:44 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Dec 08 19:17:44 localhost kernel: PCI: Using ACPI for IRQ routing
Dec 08 19:17:44 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Dec 08 19:17:44 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Dec 08 19:17:44 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Dec 08 19:17:44 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 08 19:17:44 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 08 19:17:44 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 08 19:17:44 localhost kernel: vgaarb: loaded
Dec 08 19:17:44 localhost kernel: clocksource: Switched to clocksource kvm-clock
Dec 08 19:17:44 localhost kernel: VFS: Disk quotas dquot_6.6.0
Dec 08 19:17:44 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 08 19:17:44 localhost kernel: pnp: PnP ACPI init
Dec 08 19:17:44 localhost kernel: pnp 00:03: [dma 2]
Dec 08 19:17:44 localhost kernel: pnp: PnP ACPI: found 5 devices
Dec 08 19:17:44 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 08 19:17:44 localhost kernel: NET: Registered PF_INET protocol family
Dec 08 19:17:44 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 08 19:17:44 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 08 19:17:44 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 08 19:17:44 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 08 19:17:44 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 08 19:17:44 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 08 19:17:44 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 08 19:17:44 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 08 19:17:44 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 08 19:17:44 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 08 19:17:44 localhost kernel: NET: Registered PF_XDP protocol family
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 08 19:17:44 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 08 19:17:44 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 08 19:17:44 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 08 19:17:44 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 89781 usecs
Dec 08 19:17:44 localhost kernel: PCI: CLS 0 bytes, default 64
Dec 08 19:17:44 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 08 19:17:44 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 08 19:17:44 localhost kernel: ACPI: bus type thunderbolt registered
Dec 08 19:17:44 localhost kernel: Trying to unpack rootfs image as initramfs...
Dec 08 19:17:44 localhost kernel: Initialise system trusted keyrings
Dec 08 19:17:44 localhost kernel: Key type blacklist registered
Dec 08 19:17:44 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 08 19:17:44 localhost kernel: zbud: loaded
Dec 08 19:17:44 localhost kernel: integrity: Platform Keyring initialized
Dec 08 19:17:44 localhost kernel: integrity: Machine keyring initialized
Dec 08 19:17:44 localhost kernel: Freeing initrd memory: 87804K
Dec 08 19:17:44 localhost kernel: NET: Registered PF_ALG protocol family
Dec 08 19:17:44 localhost kernel: xor: automatically using best checksumming function   avx       
Dec 08 19:17:44 localhost kernel: Key type asymmetric registered
Dec 08 19:17:44 localhost kernel: Asymmetric key parser 'x509' registered
Dec 08 19:17:44 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 08 19:17:44 localhost kernel: io scheduler mq-deadline registered
Dec 08 19:17:44 localhost kernel: io scheduler kyber registered
Dec 08 19:17:44 localhost kernel: io scheduler bfq registered
Dec 08 19:17:44 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 08 19:17:44 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 08 19:17:44 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 08 19:17:44 localhost kernel: ACPI: button: Power Button [PWRF]
Dec 08 19:17:44 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 08 19:17:44 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 08 19:17:44 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 08 19:17:44 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 08 19:17:44 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 08 19:17:44 localhost kernel: Non-volatile memory driver v1.3
Dec 08 19:17:44 localhost kernel: rdac: device handler registered
Dec 08 19:17:44 localhost kernel: hp_sw: device handler registered
Dec 08 19:17:44 localhost kernel: emc: device handler registered
Dec 08 19:17:44 localhost kernel: alua: device handler registered
Dec 08 19:17:44 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 08 19:17:44 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 08 19:17:44 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 08 19:17:44 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 08 19:17:44 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 08 19:17:44 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 08 19:17:44 localhost kernel: usb usb1: Product: UHCI Host Controller
Dec 08 19:17:44 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-645.el9.x86_64 uhci_hcd
Dec 08 19:17:44 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 08 19:17:44 localhost kernel: hub 1-0:1.0: USB hub found
Dec 08 19:17:44 localhost kernel: hub 1-0:1.0: 2 ports detected
Dec 08 19:17:44 localhost kernel: usbcore: registered new interface driver usbserial_generic
Dec 08 19:17:44 localhost kernel: usbserial: USB Serial support registered for generic
Dec 08 19:17:44 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 08 19:17:44 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 08 19:17:44 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 08 19:17:44 localhost kernel: mousedev: PS/2 mouse device common for all mice
Dec 08 19:17:44 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 08 19:17:44 localhost kernel: rtc_cmos 00:04: registered as rtc0
Dec 08 19:17:44 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 08 19:17:44 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-12-08T19:17:43 UTC (1765221463)
Dec 08 19:17:44 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 08 19:17:44 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 08 19:17:44 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 08 19:17:44 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 08 19:17:44 localhost kernel: usbcore: registered new interface driver usbhid
Dec 08 19:17:44 localhost kernel: usbhid: USB HID core driver
Dec 08 19:17:44 localhost kernel: drop_monitor: Initializing network drop monitor service
Dec 08 19:17:44 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 08 19:17:44 localhost kernel: Initializing XFRM netlink socket
Dec 08 19:17:44 localhost kernel: NET: Registered PF_INET6 protocol family
Dec 08 19:17:44 localhost kernel: Segment Routing with IPv6
Dec 08 19:17:44 localhost kernel: NET: Registered PF_PACKET protocol family
Dec 08 19:17:44 localhost kernel: mpls_gso: MPLS GSO support
Dec 08 19:17:44 localhost kernel: IPI shorthand broadcast: enabled
Dec 08 19:17:44 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Dec 08 19:17:44 localhost kernel: AES CTR mode by8 optimization enabled
Dec 08 19:17:44 localhost kernel: sched_clock: Marking stable (1212004042, 152610125)->(1450847271, -86233104)
Dec 08 19:17:44 localhost kernel: registered taskstats version 1
Dec 08 19:17:44 localhost kernel: Loading compiled-in X.509 certificates
Dec 08 19:17:44 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 08 19:17:44 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 08 19:17:44 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 08 19:17:44 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 08 19:17:44 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 08 19:17:44 localhost kernel: Demotion targets for Node 0: null
Dec 08 19:17:44 localhost kernel: page_owner is disabled
Dec 08 19:17:44 localhost kernel: Key type .fscrypt registered
Dec 08 19:17:44 localhost kernel: Key type fscrypt-provisioning registered
Dec 08 19:17:44 localhost kernel: Key type big_key registered
Dec 08 19:17:44 localhost kernel: Key type encrypted registered
Dec 08 19:17:44 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 08 19:17:44 localhost kernel: Loading compiled-in module X.509 certificates
Dec 08 19:17:44 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4c28336b4850d771d036b52fb2778fdb4f02f708'
Dec 08 19:17:44 localhost kernel: ima: Allocated hash algorithm: sha256
Dec 08 19:17:44 localhost kernel: ima: No architecture policies found
Dec 08 19:17:44 localhost kernel: evm: Initialising EVM extended attributes:
Dec 08 19:17:44 localhost kernel: evm: security.selinux
Dec 08 19:17:44 localhost kernel: evm: security.SMACK64 (disabled)
Dec 08 19:17:44 localhost kernel: evm: security.SMACK64EXEC (disabled)
Dec 08 19:17:44 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 08 19:17:44 localhost kernel: evm: security.SMACK64MMAP (disabled)
Dec 08 19:17:44 localhost kernel: evm: security.apparmor (disabled)
Dec 08 19:17:44 localhost kernel: evm: security.ima
Dec 08 19:17:44 localhost kernel: evm: security.capability
Dec 08 19:17:44 localhost kernel: evm: HMAC attrs: 0x1
Dec 08 19:17:44 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 08 19:17:44 localhost kernel: Running certificate verification RSA selftest
Dec 08 19:17:44 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 08 19:17:44 localhost kernel: Running certificate verification ECDSA selftest
Dec 08 19:17:44 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 08 19:17:44 localhost kernel: clk: Disabling unused clocks
Dec 08 19:17:44 localhost kernel: Freeing unused decrypted memory: 2028K
Dec 08 19:17:44 localhost kernel: Freeing unused kernel image (initmem) memory: 4196K
Dec 08 19:17:44 localhost kernel: Write protecting the kernel read-only data: 30720k
Dec 08 19:17:44 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 428K
Dec 08 19:17:44 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 08 19:17:44 localhost kernel: Run /init as init process
Dec 08 19:17:44 localhost kernel:   with arguments:
Dec 08 19:17:44 localhost kernel:     /init
Dec 08 19:17:44 localhost kernel:   with environment:
Dec 08 19:17:44 localhost kernel:     HOME=/
Dec 08 19:17:44 localhost kernel:     TERM=linux
Dec 08 19:17:44 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64
Dec 08 19:17:44 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 08 19:17:44 localhost systemd[1]: Detected virtualization kvm.
Dec 08 19:17:44 localhost systemd[1]: Detected architecture x86-64.
Dec 08 19:17:44 localhost systemd[1]: Running in initrd.
Dec 08 19:17:44 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 08 19:17:44 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 08 19:17:44 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Dec 08 19:17:44 localhost kernel: usb 1-1: Manufacturer: QEMU
Dec 08 19:17:44 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 08 19:17:44 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 08 19:17:44 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 08 19:17:44 localhost systemd[1]: No hostname configured, using default hostname.
Dec 08 19:17:44 localhost systemd[1]: Hostname set to <localhost>.
Dec 08 19:17:44 localhost systemd[1]: Initializing machine ID from VM UUID.
Dec 08 19:17:44 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Dec 08 19:17:44 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 08 19:17:44 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 08 19:17:44 localhost systemd[1]: Reached target Initrd /usr File System.
Dec 08 19:17:44 localhost systemd[1]: Reached target Local File Systems.
Dec 08 19:17:44 localhost systemd[1]: Reached target Path Units.
Dec 08 19:17:44 localhost systemd[1]: Reached target Slice Units.
Dec 08 19:17:44 localhost systemd[1]: Reached target Swaps.
Dec 08 19:17:44 localhost systemd[1]: Reached target Timer Units.
Dec 08 19:17:44 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 08 19:17:44 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Dec 08 19:17:44 localhost systemd[1]: Listening on Journal Socket.
Dec 08 19:17:44 localhost systemd[1]: Listening on udev Control Socket.
Dec 08 19:17:44 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 08 19:17:44 localhost systemd[1]: Reached target Socket Units.
Dec 08 19:17:44 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 08 19:17:44 localhost systemd[1]: Starting Journal Service...
Dec 08 19:17:44 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 08 19:17:44 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 08 19:17:44 localhost systemd[1]: Starting Create System Users...
Dec 08 19:17:44 localhost systemd[1]: Starting Setup Virtual Console...
Dec 08 19:17:44 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 08 19:17:44 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 08 19:17:44 localhost systemd[1]: Finished Create System Users.
Dec 08 19:17:44 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 08 19:17:44 localhost systemd-journald[307]: Journal started
Dec 08 19:17:44 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/58f8fcaaa5ac48a3a561edc106bffe35) is 8.0M, max 153.6M, 145.6M free.
Dec 08 19:17:44 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Dec 08 19:17:44 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Dec 08 19:17:44 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 08 19:17:44 localhost systemd[1]: Started Journal Service.
Dec 08 19:17:44 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 08 19:17:44 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 08 19:17:44 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 08 19:17:44 localhost systemd[1]: Finished Setup Virtual Console.
Dec 08 19:17:44 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 08 19:17:44 localhost systemd[1]: Starting dracut cmdline hook...
Dec 08 19:17:44 localhost dracut-cmdline[328]: dracut-9 dracut-057-102.git20250818.el9
Dec 08 19:17:44 localhost dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-645.el9.x86_64 root=UUID=fcf6b761-831a-48a7-9f5f-068b5063763f ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 08 19:17:45 localhost systemd[1]: Finished dracut cmdline hook.
Dec 08 19:17:45 localhost systemd[1]: Starting dracut pre-udev hook...
Dec 08 19:17:45 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 08 19:17:45 localhost kernel: device-mapper: uevent: version 1.0.3
Dec 08 19:17:45 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 08 19:17:45 localhost kernel: RPC: Registered named UNIX socket transport module.
Dec 08 19:17:45 localhost kernel: RPC: Registered udp transport module.
Dec 08 19:17:45 localhost kernel: RPC: Registered tcp transport module.
Dec 08 19:17:45 localhost kernel: RPC: Registered tcp-with-tls transport module.
Dec 08 19:17:45 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 08 19:17:45 localhost rpc.statd[445]: Version 2.5.4 starting
Dec 08 19:17:45 localhost rpc.statd[445]: Initializing NSM state
Dec 08 19:17:45 localhost rpc.idmapd[450]: Setting log level to 0
Dec 08 19:17:45 localhost systemd[1]: Finished dracut pre-udev hook.
Dec 08 19:17:45 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 08 19:17:45 localhost systemd-udevd[463]: Using default interface naming scheme 'rhel-9.0'.
Dec 08 19:17:45 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 08 19:17:45 localhost systemd[1]: Starting dracut pre-trigger hook...
Dec 08 19:17:45 localhost systemd[1]: Finished dracut pre-trigger hook.
Dec 08 19:17:45 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 08 19:17:45 localhost systemd[1]: Created slice Slice /system/modprobe.
Dec 08 19:17:45 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 08 19:17:45 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 08 19:17:45 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 08 19:17:45 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 08 19:17:45 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 08 19:17:45 localhost systemd[1]: Reached target Network.
Dec 08 19:17:45 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 08 19:17:45 localhost systemd[1]: Starting dracut initqueue hook...
Dec 08 19:17:45 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 08 19:17:45 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 08 19:17:45 localhost kernel:  vda: vda1
Dec 08 19:17:45 localhost kernel: libata version 3.00 loaded.
Dec 08 19:17:45 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Dec 08 19:17:45 localhost kernel: scsi host0: ata_piix
Dec 08 19:17:45 localhost kernel: scsi host1: ata_piix
Dec 08 19:17:45 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 08 19:17:45 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 08 19:17:45 localhost systemd[1]: Found device /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 08 19:17:45 localhost systemd[1]: Reached target Initrd Root Device.
Dec 08 19:17:45 localhost systemd[1]: Mounting Kernel Configuration File System...
Dec 08 19:17:45 localhost systemd[1]: Mounted Kernel Configuration File System.
Dec 08 19:17:45 localhost systemd[1]: Reached target System Initialization.
Dec 08 19:17:45 localhost systemd[1]: Reached target Basic System.
Dec 08 19:17:45 localhost kernel: ata1: found unknown device (class 0)
Dec 08 19:17:45 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 08 19:17:45 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 08 19:17:45 localhost systemd-udevd[464]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 19:17:45 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 08 19:17:45 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 08 19:17:45 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 08 19:17:45 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Dec 08 19:17:45 localhost systemd[1]: Finished dracut initqueue hook.
Dec 08 19:17:45 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Dec 08 19:17:45 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Dec 08 19:17:45 localhost systemd[1]: Reached target Remote File Systems.
Dec 08 19:17:45 localhost systemd[1]: Starting dracut pre-mount hook...
Dec 08 19:17:45 localhost systemd[1]: Finished dracut pre-mount hook.
Dec 08 19:17:45 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f...
Dec 08 19:17:45 localhost systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Dec 08 19:17:45 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f.
Dec 08 19:17:45 localhost systemd[1]: Mounting /sysroot...
Dec 08 19:17:46 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 08 19:17:46 localhost kernel: XFS (vda1): Mounting V5 Filesystem fcf6b761-831a-48a7-9f5f-068b5063763f
Dec 08 19:17:46 localhost kernel: XFS (vda1): Ending clean mount
Dec 08 19:17:46 localhost systemd[1]: Mounted /sysroot.
Dec 08 19:17:46 localhost systemd[1]: Reached target Initrd Root File System.
Dec 08 19:17:46 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 08 19:17:46 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 08 19:17:46 localhost systemd[1]: Reached target Initrd File Systems.
Dec 08 19:17:46 localhost systemd[1]: Reached target Initrd Default Target.
Dec 08 19:17:46 localhost systemd[1]: Starting dracut mount hook...
Dec 08 19:17:46 localhost systemd[1]: Finished dracut mount hook.
Dec 08 19:17:46 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 08 19:17:46 localhost rpc.idmapd[450]: exiting on signal 15
Dec 08 19:17:46 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 08 19:17:46 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 08 19:17:46 localhost systemd[1]: Stopped target Network.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Timer Units.
Dec 08 19:17:46 localhost systemd[1]: dbus.socket: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 08 19:17:46 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Initrd Default Target.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Basic System.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Initrd Root Device.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Initrd /usr File System.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Path Units.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Remote File Systems.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Slice Units.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Socket Units.
Dec 08 19:17:46 localhost systemd[1]: Stopped target System Initialization.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Local File Systems.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Swaps.
Dec 08 19:17:46 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped dracut mount hook.
Dec 08 19:17:46 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped dracut pre-mount hook.
Dec 08 19:17:46 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Dec 08 19:17:46 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 08 19:17:46 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped dracut initqueue hook.
Dec 08 19:17:46 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped Apply Kernel Variables.
Dec 08 19:17:46 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Dec 08 19:17:46 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped Coldplug All udev Devices.
Dec 08 19:17:46 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped dracut pre-trigger hook.
Dec 08 19:17:46 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 08 19:17:46 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped Setup Virtual Console.
Dec 08 19:17:46 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 08 19:17:46 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 08 19:17:46 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Closed udev Control Socket.
Dec 08 19:17:46 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Closed udev Kernel Socket.
Dec 08 19:17:46 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped dracut pre-udev hook.
Dec 08 19:17:46 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped dracut cmdline hook.
Dec 08 19:17:46 localhost systemd[1]: Starting Cleanup udev Database...
Dec 08 19:17:46 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 08 19:17:46 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Dec 08 19:17:46 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Stopped Create System Users.
Dec 08 19:17:46 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 08 19:17:46 localhost systemd[1]: Finished Cleanup udev Database.
Dec 08 19:17:46 localhost systemd[1]: Reached target Switch Root.
Dec 08 19:17:46 localhost systemd[1]: Starting Switch Root...
Dec 08 19:17:46 localhost systemd[1]: Switching root.
Dec 08 19:17:46 localhost systemd-journald[307]: Journal stopped
Dec 08 19:17:47 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Dec 08 19:17:47 localhost kernel: audit: type=1404 audit(1765221466.920:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 08 19:17:47 localhost kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 19:17:47 localhost kernel: SELinux:  policy capability open_perms=1
Dec 08 19:17:47 localhost kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 19:17:47 localhost kernel: SELinux:  policy capability always_check_network=0
Dec 08 19:17:47 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 19:17:47 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 19:17:47 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 19:17:47 localhost kernel: audit: type=1403 audit(1765221467.053:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 08 19:17:47 localhost systemd[1]: Successfully loaded SELinux policy in 136.717ms.
Dec 08 19:17:47 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.580ms.
Dec 08 19:17:47 localhost systemd[1]: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 08 19:17:47 localhost systemd[1]: Detected virtualization kvm.
Dec 08 19:17:47 localhost systemd[1]: Detected architecture x86-64.
Dec 08 19:17:47 localhost systemd-rc-local-generator[642]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:17:47 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 08 19:17:47 localhost systemd[1]: Stopped Switch Root.
Dec 08 19:17:47 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 08 19:17:47 localhost systemd[1]: Created slice Slice /system/getty.
Dec 08 19:17:47 localhost systemd[1]: Created slice Slice /system/serial-getty.
Dec 08 19:17:47 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Dec 08 19:17:47 localhost systemd[1]: Created slice User and Session Slice.
Dec 08 19:17:47 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Dec 08 19:17:47 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Dec 08 19:17:47 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 08 19:17:47 localhost systemd[1]: Reached target Local Encrypted Volumes.
Dec 08 19:17:47 localhost systemd[1]: Stopped target Switch Root.
Dec 08 19:17:47 localhost systemd[1]: Stopped target Initrd File Systems.
Dec 08 19:17:47 localhost systemd[1]: Stopped target Initrd Root File System.
Dec 08 19:17:47 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Dec 08 19:17:47 localhost systemd[1]: Reached target Path Units.
Dec 08 19:17:47 localhost systemd[1]: Reached target rpc_pipefs.target.
Dec 08 19:17:47 localhost systemd[1]: Reached target Slice Units.
Dec 08 19:17:47 localhost systemd[1]: Reached target Swaps.
Dec 08 19:17:47 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Dec 08 19:17:47 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Dec 08 19:17:47 localhost systemd[1]: Reached target RPC Port Mapper.
Dec 08 19:17:47 localhost systemd[1]: Listening on Process Core Dump Socket.
Dec 08 19:17:47 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Dec 08 19:17:47 localhost systemd[1]: Listening on udev Control Socket.
Dec 08 19:17:47 localhost systemd[1]: Listening on udev Kernel Socket.
Dec 08 19:17:47 localhost systemd[1]: Mounting Huge Pages File System...
Dec 08 19:17:47 localhost systemd[1]: Mounting POSIX Message Queue File System...
Dec 08 19:17:47 localhost systemd[1]: Mounting Kernel Debug File System...
Dec 08 19:17:47 localhost systemd[1]: Mounting Kernel Trace File System...
Dec 08 19:17:47 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 08 19:17:47 localhost systemd[1]: Starting Create List of Static Device Nodes...
Dec 08 19:17:47 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 08 19:17:47 localhost systemd[1]: Starting Load Kernel Module drm...
Dec 08 19:17:47 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Dec 08 19:17:47 localhost systemd[1]: Starting Load Kernel Module fuse...
Dec 08 19:17:47 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 08 19:17:47 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 08 19:17:47 localhost systemd[1]: Stopped File System Check on Root Device.
Dec 08 19:17:47 localhost systemd[1]: Stopped Journal Service.
Dec 08 19:17:47 localhost systemd[1]: Starting Journal Service...
Dec 08 19:17:47 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 08 19:17:47 localhost systemd[1]: Starting Generate network units from Kernel command line...
Dec 08 19:17:47 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 08 19:17:47 localhost kernel: fuse: init (API version 7.37)
Dec 08 19:17:47 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Dec 08 19:17:47 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 08 19:17:47 localhost systemd[1]: Starting Apply Kernel Variables...
Dec 08 19:17:47 localhost systemd[1]: Starting Coldplug All udev Devices...
Dec 08 19:17:47 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 08 19:17:47 localhost systemd[1]: Mounted Huge Pages File System.
Dec 08 19:17:47 localhost systemd-journald[683]: Journal started
Dec 08 19:17:47 localhost systemd-journald[683]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 08 19:17:47 localhost systemd[1]: Queued start job for default target Multi-User System.
Dec 08 19:17:47 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 08 19:17:47 localhost systemd[1]: Started Journal Service.
Dec 08 19:17:47 localhost systemd[1]: Mounted POSIX Message Queue File System.
Dec 08 19:17:47 localhost systemd[1]: Mounted Kernel Debug File System.
Dec 08 19:17:47 localhost systemd[1]: Mounted Kernel Trace File System.
Dec 08 19:17:47 localhost systemd[1]: Finished Create List of Static Device Nodes.
Dec 08 19:17:47 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 08 19:17:47 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 08 19:17:47 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 08 19:17:47 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 08 19:17:47 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 08 19:17:47 localhost systemd[1]: Finished Load Kernel Module fuse.
Dec 08 19:17:47 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 08 19:17:47 localhost systemd[1]: Finished Generate network units from Kernel command line.
Dec 08 19:17:47 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 08 19:17:47 localhost systemd[1]: Finished Apply Kernel Variables.
Dec 08 19:17:47 localhost kernel: ACPI: bus type drm_connector registered
Dec 08 19:17:47 localhost systemd[1]: Mounting FUSE Control File System...
Dec 08 19:17:47 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 08 19:17:47 localhost systemd[1]: Starting Rebuild Hardware Database...
Dec 08 19:17:47 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 08 19:17:47 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 08 19:17:47 localhost systemd[1]: Starting Load/Save OS Random Seed...
Dec 08 19:17:47 localhost systemd[1]: Starting Create System Users...
Dec 08 19:17:47 localhost systemd-journald[683]: Runtime Journal (/run/log/journal/4d4ef2323cc3337bbfd9081b2a323b4e) is 8.0M, max 153.6M, 145.6M free.
Dec 08 19:17:47 localhost systemd-journald[683]: Received client request to flush runtime journal.
Dec 08 19:17:47 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 08 19:17:47 localhost systemd[1]: Finished Load Kernel Module drm.
Dec 08 19:17:47 localhost systemd[1]: Mounted FUSE Control File System.
Dec 08 19:17:47 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 08 19:17:47 localhost systemd[1]: Finished Coldplug All udev Devices.
Dec 08 19:17:47 localhost systemd[1]: Finished Create System Users.
Dec 08 19:17:47 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 08 19:17:47 localhost systemd[1]: Finished Load/Save OS Random Seed.
Dec 08 19:17:47 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 08 19:17:47 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 08 19:17:47 localhost systemd[1]: Reached target Preparation for Local File Systems.
Dec 08 19:17:47 localhost systemd[1]: Reached target Local File Systems.
Dec 08 19:17:47 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 08 19:17:47 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 08 19:17:47 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 08 19:17:47 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 08 19:17:47 localhost systemd[1]: Starting Automatic Boot Loader Update...
Dec 08 19:17:47 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 08 19:17:47 localhost systemd[1]: Starting Create Volatile Files and Directories...
Dec 08 19:17:47 localhost bootctl[700]: Couldn't find EFI system partition, skipping.
Dec 08 19:17:47 localhost systemd[1]: Finished Automatic Boot Loader Update.
Dec 08 19:17:47 localhost systemd[1]: Finished Create Volatile Files and Directories.
Dec 08 19:17:47 localhost systemd[1]: Starting Security Auditing Service...
Dec 08 19:17:47 localhost systemd[1]: Starting RPC Bind...
Dec 08 19:17:47 localhost systemd[1]: Starting Rebuild Journal Catalog...
Dec 08 19:17:47 localhost auditd[706]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 08 19:17:47 localhost systemd[1]: Finished Rebuild Journal Catalog.
Dec 08 19:17:47 localhost auditd[706]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 08 19:17:48 localhost systemd[1]: Started RPC Bind.
Dec 08 19:17:48 localhost augenrules[711]: /sbin/augenrules: No change
Dec 08 19:17:48 localhost augenrules[726]: No rules
Dec 08 19:17:48 localhost augenrules[726]: enabled 1
Dec 08 19:17:48 localhost augenrules[726]: failure 1
Dec 08 19:17:48 localhost augenrules[726]: pid 706
Dec 08 19:17:48 localhost augenrules[726]: rate_limit 0
Dec 08 19:17:48 localhost augenrules[726]: backlog_limit 8192
Dec 08 19:17:48 localhost augenrules[726]: lost 0
Dec 08 19:17:48 localhost augenrules[726]: backlog 0
Dec 08 19:17:48 localhost augenrules[726]: backlog_wait_time 60000
Dec 08 19:17:48 localhost augenrules[726]: backlog_wait_time_actual 0
Dec 08 19:17:48 localhost augenrules[726]: enabled 1
Dec 08 19:17:48 localhost augenrules[726]: failure 1
Dec 08 19:17:48 localhost augenrules[726]: pid 706
Dec 08 19:17:48 localhost augenrules[726]: rate_limit 0
Dec 08 19:17:48 localhost augenrules[726]: backlog_limit 8192
Dec 08 19:17:48 localhost augenrules[726]: lost 0
Dec 08 19:17:48 localhost augenrules[726]: backlog 4
Dec 08 19:17:48 localhost augenrules[726]: backlog_wait_time 60000
Dec 08 19:17:48 localhost augenrules[726]: backlog_wait_time_actual 0
Dec 08 19:17:48 localhost augenrules[726]: enabled 1
Dec 08 19:17:48 localhost augenrules[726]: failure 1
Dec 08 19:17:48 localhost augenrules[726]: pid 706
Dec 08 19:17:48 localhost augenrules[726]: rate_limit 0
Dec 08 19:17:48 localhost augenrules[726]: backlog_limit 8192
Dec 08 19:17:48 localhost augenrules[726]: lost 0
Dec 08 19:17:48 localhost augenrules[726]: backlog 0
Dec 08 19:17:48 localhost augenrules[726]: backlog_wait_time 60000
Dec 08 19:17:48 localhost augenrules[726]: backlog_wait_time_actual 0
Dec 08 19:17:48 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 08 19:17:48 localhost systemd[1]: Started Security Auditing Service.
Dec 08 19:17:48 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 08 19:17:48 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 08 19:17:48 localhost systemd[1]: Finished Rebuild Hardware Database.
Dec 08 19:17:48 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 08 19:17:48 localhost systemd[1]: Starting Update is Completed...
Dec 08 19:17:48 localhost systemd[1]: Finished Update is Completed.
Dec 08 19:17:48 localhost systemd-udevd[734]: Using default interface naming scheme 'rhel-9.0'.
Dec 08 19:17:48 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 08 19:17:48 localhost systemd[1]: Reached target System Initialization.
Dec 08 19:17:48 localhost systemd[1]: Started dnf makecache --timer.
Dec 08 19:17:48 localhost systemd[1]: Started Daily rotation of log files.
Dec 08 19:17:48 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 08 19:17:48 localhost systemd[1]: Reached target Timer Units.
Dec 08 19:17:48 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 08 19:17:48 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 08 19:17:48 localhost systemd[1]: Reached target Socket Units.
Dec 08 19:17:48 localhost systemd[1]: Starting D-Bus System Message Bus...
Dec 08 19:17:48 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 08 19:17:48 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 08 19:17:48 localhost systemd[1]: Starting Load Kernel Module configfs...
Dec 08 19:17:48 localhost systemd-udevd[744]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 19:17:48 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 08 19:17:48 localhost systemd[1]: Finished Load Kernel Module configfs.
Dec 08 19:17:48 localhost systemd[1]: Started D-Bus System Message Bus.
Dec 08 19:17:48 localhost systemd[1]: Reached target Basic System.
Dec 08 19:17:48 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 08 19:17:48 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 08 19:17:48 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 08 19:17:48 localhost dbus-broker-lau[762]: Ready
Dec 08 19:17:48 localhost systemd[1]: Starting NTP client/server...
Dec 08 19:17:48 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 08 19:17:48 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 08 19:17:48 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 08 19:17:48 localhost systemd[1]: Starting IPv4 firewall with iptables...
Dec 08 19:17:48 localhost systemd[1]: Started irqbalance daemon.
Dec 08 19:17:48 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 08 19:17:48 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 08 19:17:48 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 08 19:17:48 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 08 19:17:48 localhost systemd[1]: Reached target sshd-keygen.target.
Dec 08 19:17:48 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 08 19:17:48 localhost systemd[1]: Reached target User and Group Name Lookups.
Dec 08 19:17:48 localhost chronyd[792]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 08 19:17:48 localhost chronyd[792]: Loaded 0 symmetric keys
Dec 08 19:17:48 localhost chronyd[792]: Using right/UTC timezone to obtain leap second data
Dec 08 19:17:48 localhost chronyd[792]: Loaded seccomp filter (level 2)
Dec 08 19:17:48 localhost systemd[1]: Starting User Login Management...
Dec 08 19:17:48 localhost systemd[1]: Started NTP client/server.
Dec 08 19:17:48 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 08 19:17:48 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 08 19:17:48 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 08 19:17:48 localhost systemd-logind[793]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 08 19:17:48 localhost systemd-logind[793]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 08 19:17:48 localhost systemd-logind[793]: New seat seat0.
Dec 08 19:17:48 localhost systemd[1]: Started User Login Management.
Dec 08 19:17:48 localhost kernel: kvm_amd: TSC scaling supported
Dec 08 19:17:48 localhost kernel: kvm_amd: Nested Virtualization enabled
Dec 08 19:17:48 localhost kernel: kvm_amd: Nested Paging enabled
Dec 08 19:17:48 localhost kernel: kvm_amd: LBR virtualization supported
Dec 08 19:17:48 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 08 19:17:48 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 08 19:17:48 localhost kernel: Console: switching to colour dummy device 80x25
Dec 08 19:17:48 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 08 19:17:48 localhost kernel: [drm] features: -context_init
Dec 08 19:17:48 localhost kernel: [drm] number of scanouts: 1
Dec 08 19:17:48 localhost kernel: [drm] number of cap sets: 0
Dec 08 19:17:48 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 08 19:17:48 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 08 19:17:48 localhost kernel: Console: switching to colour frame buffer device 128x48
Dec 08 19:17:48 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 08 19:17:48 localhost iptables.init[782]: iptables: Applying firewall rules: [  OK  ]
Dec 08 19:17:48 localhost systemd[1]: Finished IPv4 firewall with iptables.
Dec 08 19:17:49 localhost cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Mon, 08 Dec 2025 19:17:49 +0000. Up 6.78 seconds.
Dec 08 19:17:49 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Dec 08 19:17:49 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Dec 08 19:17:49 localhost systemd[1]: run-cloud\x2dinit-tmp-tmp3uoumd5v.mount: Deactivated successfully.
Dec 08 19:17:49 localhost systemd[1]: Starting Hostname Service...
Dec 08 19:17:49 localhost systemd[1]: Started Hostname Service.
Dec 08 19:17:49 np0005550792.novalocal systemd-hostnamed[856]: Hostname set to <np0005550792.novalocal> (static)
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Reached target Preparation for Network.
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Starting Network Manager...
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6300] NetworkManager (version 1.54.1-1.el9) is starting... (boot:dbd0b2df-41a2-4b72-b337-1b1fd8346088)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6305] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6374] manager[0x5572ae137080]: monitoring kernel firmware directory '/lib/firmware'.
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6485] hostname: hostname: using hostnamed
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6486] hostname: static hostname changed from (none) to "np0005550792.novalocal"
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6496] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6605] manager[0x5572ae137080]: rfkill: Wi-Fi hardware radio set enabled
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6606] manager[0x5572ae137080]: rfkill: WWAN hardware radio set enabled
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6665] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6666] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6666] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6666] manager: Networking is enabled by state file
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6669] settings: Loaded settings plugin: keyfile (internal)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6690] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6716] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6729] dhcp: init: Using DHCP client 'internal'
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6731] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6744] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6751] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6757] device (lo): Activation: starting connection 'lo' (da4f7c5b-a714-4e67-a816-06de13118f80)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6765] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6767] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6793] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6796] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6798] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6800] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6802] device (eth0): carrier: link connected
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6804] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6809] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6818] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6822] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6823] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6824] manager: NetworkManager state is now CONNECTING
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6825] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6832] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6835] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6871] dhcp4 (eth0): state changed new lease, address=38.102.83.66
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6877] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.6894] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Started Network Manager.
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Reached target Network.
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.7113] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.7116] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.7124] device (lo): Activation: successful, device activated.
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.7130] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.7131] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.7134] manager: NetworkManager state is now CONNECTED_SITE
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.7135] device (eth0): Activation: successful, device activated.
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.7140] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 08 19:17:49 np0005550792.novalocal NetworkManager[860]: <info>  [1765221469.7142] manager: startup complete
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Reached target NFS client services.
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Reached target Remote File Systems.
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 08 19:17:49 np0005550792.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: Cloud-init v. 24.4-7.el9 running 'init' at Mon, 08 Dec 2025 19:17:50 +0000. Up 7.72 seconds.
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: |  eth0  | True |         38.102.83.66         | 255.255.255.0 | global | fa:16:3e:cc:4a:4f |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fecc:4a4f/64 |       .       |  link  | fa:16:3e:cc:4a:4f |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 08 19:17:50 np0005550792.novalocal cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 08 19:17:50 np0005550792.novalocal useradd[988]: new group: name=cloud-user, GID=1001
Dec 08 19:17:50 np0005550792.novalocal useradd[988]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Dec 08 19:17:50 np0005550792.novalocal useradd[988]: add 'cloud-user' to group 'adm'
Dec 08 19:17:50 np0005550792.novalocal useradd[988]: add 'cloud-user' to group 'systemd-journal'
Dec 08 19:17:50 np0005550792.novalocal useradd[988]: add 'cloud-user' to shadow group 'adm'
Dec 08 19:17:50 np0005550792.novalocal useradd[988]: add 'cloud-user' to shadow group 'systemd-journal'
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: Generating public/private rsa key pair.
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: The key fingerprint is:
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: SHA256:M8SMdVi0uQyI7juPWI1LcdsMbUXV7BUwSENL96S7RlI root@np0005550792.novalocal
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: The key's randomart image is:
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: +---[RSA 3072]----+
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |        .+B*o*.o.|
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |     . *.o.++ B .|
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |    . o = +. E o |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |   .   o + .. o  |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |    o o S o. o   |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |   . = * o  o .  |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |    = o o    o   |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |   +.+      .    |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |  . +o.          |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: +----[SHA256]-----+
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: Generating public/private ecdsa key pair.
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: The key fingerprint is:
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: SHA256:PmZYtZ7xIy9p8/ArUNAKJFwV0Tfb5TrzVLPkxncg4RM root@np0005550792.novalocal
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: The key's randomart image is:
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: +---[ECDSA 256]---+
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |   ..oo.+=       |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |    ... . o E   .|
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |       . o.o * o |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |        ....= ooo|
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |        S.o  o+o+|
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |       +.. +  +=+|
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |      . =.=.o .=o|
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |       o .*= .  .|
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |         . ==.   |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: +----[SHA256]-----+
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: Generating public/private ed25519 key pair.
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: The key fingerprint is:
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: SHA256:OzcDf7WkAx8oaM40Gt1+A0/FbY42BiNvTqtXWBAPbrg root@np0005550792.novalocal
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: The key's randomart image is:
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: +--[ED25519 256]--+
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |          o.     |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |         o.+ .   |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |        o =.+ o  |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |     . o = =.+   |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |    . * E Bo* +  |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |     B o %.*.* . |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |    . o + @.= .  |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |         =.= .   |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: |        ..       |
Dec 08 19:17:51 np0005550792.novalocal cloud-init[922]: +----[SHA256]-----+
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Reached target Cloud-config availability.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Reached target Network is Online.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Starting Crash recovery kernel arming...
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Starting System Logging Service...
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Starting OpenSSH server daemon...
Dec 08 19:17:51 np0005550792.novalocal sm-notify[1004]: Version 2.5.4 starting
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Starting Permit User Sessions...
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Started Notify NFS peers of a restart.
Dec 08 19:17:51 np0005550792.novalocal sshd[1006]: Server listening on 0.0.0.0 port 22.
Dec 08 19:17:51 np0005550792.novalocal sshd[1006]: Server listening on :: port 22.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Started OpenSSH server daemon.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Finished Permit User Sessions.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Started Command Scheduler.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Started Getty on tty1.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Started Serial Getty on ttyS0.
Dec 08 19:17:51 np0005550792.novalocal crond[1009]: (CRON) STARTUP (1.5.7)
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Reached target Login Prompts.
Dec 08 19:17:51 np0005550792.novalocal crond[1009]: (CRON) INFO (Syslog will be used instead of sendmail.)
Dec 08 19:17:51 np0005550792.novalocal crond[1009]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 18% if used.)
Dec 08 19:17:51 np0005550792.novalocal crond[1009]: (CRON) INFO (running with inotify support)
Dec 08 19:17:51 np0005550792.novalocal rsyslogd[1005]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1005" x-info="https://www.rsyslog.com"] start
Dec 08 19:17:51 np0005550792.novalocal rsyslogd[1005]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Started System Logging Service.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Reached target Multi-User System.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 08 19:17:51 np0005550792.novalocal rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 08 19:17:51 np0005550792.novalocal kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Dec 08 19:17:51 np0005550792.novalocal kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-645.el9.x86_64kdump.img
Dec 08 19:17:51 np0005550792.novalocal cloud-init[1139]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Mon, 08 Dec 2025 19:17:51 +0000. Up 9.29 seconds.
Dec 08 19:17:51 np0005550792.novalocal sshd-session[1145]: Unable to negotiate with 38.102.83.114 port 43600: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Dec 08 19:17:51 np0005550792.novalocal sshd-session[1160]: Unable to negotiate with 38.102.83.114 port 43628: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Dec 08 19:17:51 np0005550792.novalocal sshd-session[1169]: Unable to negotiate with 38.102.83.114 port 43632: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Dec 08 19:17:51 np0005550792.novalocal sshd-session[1181]: Connection reset by 38.102.83.114 port 43642 [preauth]
Dec 08 19:17:51 np0005550792.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Dec 08 19:17:51 np0005550792.novalocal sshd-session[1118]: Connection closed by 38.102.83.114 port 51738 [preauth]
Dec 08 19:17:51 np0005550792.novalocal sshd-session[1199]: Connection reset by 38.102.83.114 port 43648 [preauth]
Dec 08 19:17:51 np0005550792.novalocal sshd-session[1215]: Unable to negotiate with 38.102.83.114 port 43656: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Dec 08 19:17:51 np0005550792.novalocal sshd-session[1149]: Connection closed by 38.102.83.114 port 43616 [preauth]
Dec 08 19:17:51 np0005550792.novalocal sshd-session[1229]: Unable to negotiate with 38.102.83.114 port 43660: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Dec 08 19:17:51 np0005550792.novalocal dracut[1283]: dracut-057-102.git20250818.el9
Dec 08 19:17:52 np0005550792.novalocal cloud-init[1301]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Mon, 08 Dec 2025 19:17:52 +0000. Up 9.70 seconds.
Dec 08 19:17:52 np0005550792.novalocal cloud-init[1312]: #############################################################
Dec 08 19:17:52 np0005550792.novalocal cloud-init[1314]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/fcf6b761-831a-48a7-9f5f-068b5063763f /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-645.el9.x86_64kdump.img 5.14.0-645.el9.x86_64
Dec 08 19:17:52 np0005550792.novalocal cloud-init[1321]: 256 SHA256:PmZYtZ7xIy9p8/ArUNAKJFwV0Tfb5TrzVLPkxncg4RM root@np0005550792.novalocal (ECDSA)
Dec 08 19:17:52 np0005550792.novalocal cloud-init[1327]: 256 SHA256:OzcDf7WkAx8oaM40Gt1+A0/FbY42BiNvTqtXWBAPbrg root@np0005550792.novalocal (ED25519)
Dec 08 19:17:52 np0005550792.novalocal cloud-init[1331]: 3072 SHA256:M8SMdVi0uQyI7juPWI1LcdsMbUXV7BUwSENL96S7RlI root@np0005550792.novalocal (RSA)
Dec 08 19:17:52 np0005550792.novalocal cloud-init[1333]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 08 19:17:52 np0005550792.novalocal cloud-init[1339]: #############################################################
Dec 08 19:17:52 np0005550792.novalocal cloud-init[1301]: Cloud-init v. 24.4-7.el9 finished at Mon, 08 Dec 2025 19:17:52 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.88 seconds
Dec 08 19:17:52 np0005550792.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Dec 08 19:17:52 np0005550792.novalocal systemd[1]: Reached target Cloud-init target.
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 08 19:17:52 np0005550792.novalocal dracut[1285]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: Module 'resume' will not be installed, because it's in the list to be omitted!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: memstrack is not available
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: memstrack is not available
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: *** Including module: systemd ***
Dec 08 19:17:53 np0005550792.novalocal dracut[1285]: *** Including module: fips ***
Dec 08 19:17:54 np0005550792.novalocal dracut[1285]: *** Including module: systemd-initrd ***
Dec 08 19:17:54 np0005550792.novalocal dracut[1285]: *** Including module: i18n ***
Dec 08 19:17:54 np0005550792.novalocal dracut[1285]: *** Including module: drm ***
Dec 08 19:17:54 np0005550792.novalocal chronyd[792]: Selected source 206.108.0.133 (2.centos.pool.ntp.org)
Dec 08 19:17:54 np0005550792.novalocal chronyd[792]: System clock TAI offset set to 37 seconds
Dec 08 19:17:54 np0005550792.novalocal dracut[1285]: *** Including module: prefixdevname ***
Dec 08 19:17:54 np0005550792.novalocal dracut[1285]: *** Including module: kernel-modules ***
Dec 08 19:17:54 np0005550792.novalocal kernel: block vda: the capability attribute has been deprecated.
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]: *** Including module: kernel-modules-extra ***
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]: *** Including module: qemu ***
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]: *** Including module: fstab-sys ***
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]: *** Including module: rootfs-block ***
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]: *** Including module: terminfo ***
Dec 08 19:17:55 np0005550792.novalocal dracut[1285]: *** Including module: udev-rules ***
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]: Skipping udev rule: 91-permissions.rules
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]: *** Including module: virtiofs ***
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]: *** Including module: dracut-systemd ***
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]: *** Including module: usrmount ***
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]: *** Including module: base ***
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]: *** Including module: fs-lib ***
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]: *** Including module: kdumpbase ***
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]:   microcode_ctl module: mangling fw_dir
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel" is ignored
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 08 19:17:56 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]: *** Including module: openssl ***
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]: *** Including module: shutdown ***
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]: *** Including module: squash ***
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]: *** Including modules done ***
Dec 08 19:17:57 np0005550792.novalocal dracut[1285]: *** Installing kernel module dependencies ***
Dec 08 19:17:58 np0005550792.novalocal dracut[1285]: *** Installing kernel module dependencies done ***
Dec 08 19:17:58 np0005550792.novalocal dracut[1285]: *** Resolving executable dependencies ***
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: Cannot change IRQ 25 affinity: Operation not permitted
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: IRQ 25 affinity is now unmanaged
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: IRQ 31 affinity is now unmanaged
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: IRQ 28 affinity is now unmanaged
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: IRQ 32 affinity is now unmanaged
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: IRQ 30 affinity is now unmanaged
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 08 19:17:59 np0005550792.novalocal irqbalance[785]: IRQ 29 affinity is now unmanaged
Dec 08 19:17:59 np0005550792.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 08 19:17:59 np0005550792.novalocal dracut[1285]: *** Resolving executable dependencies done ***
Dec 08 19:17:59 np0005550792.novalocal dracut[1285]: *** Generating early-microcode cpio image ***
Dec 08 19:17:59 np0005550792.novalocal dracut[1285]: *** Store current command line parameters ***
Dec 08 19:17:59 np0005550792.novalocal dracut[1285]: Stored kernel commandline:
Dec 08 19:17:59 np0005550792.novalocal dracut[1285]: No dracut internal kernel commandline stored in the initramfs
Dec 08 19:18:00 np0005550792.novalocal dracut[1285]: *** Install squash loader ***
Dec 08 19:18:00 np0005550792.novalocal dracut[1285]: *** Squashing the files inside the initramfs ***
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: *** Squashing the files inside the initramfs done ***
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: *** Creating image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' ***
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: *** Hardlinking files ***
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: Mode:           real
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: Files:          50
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: Linked:         0 files
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: Compared:       0 xattrs
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: Compared:       0 files
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: Saved:          0 B
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: Duration:       0.000519 seconds
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: *** Hardlinking files done ***
Dec 08 19:18:02 np0005550792.novalocal dracut[1285]: *** Creating initramfs image file '/boot/initramfs-5.14.0-645.el9.x86_64kdump.img' done ***
Dec 08 19:18:02 np0005550792.novalocal kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Dec 08 19:18:02 np0005550792.novalocal kdumpctl[1018]: kdump: Starting kdump: [OK]
Dec 08 19:18:02 np0005550792.novalocal systemd[1]: Finished Crash recovery kernel arming.
Dec 08 19:18:02 np0005550792.novalocal systemd[1]: Startup finished in 1.529s (kernel) + 3.052s (initrd) + 15.974s (userspace) = 20.557s.
Dec 08 19:18:14 np0005550792.novalocal sshd-session[4295]: Accepted publickey for zuul from 38.102.83.114 port 54344 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Dec 08 19:18:14 np0005550792.novalocal systemd[1]: Created slice User Slice of UID 1000.
Dec 08 19:18:14 np0005550792.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 08 19:18:14 np0005550792.novalocal systemd-logind[793]: New session 1 of user zuul.
Dec 08 19:18:14 np0005550792.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 08 19:18:14 np0005550792.novalocal systemd[1]: Starting User Manager for UID 1000...
Dec 08 19:18:14 np0005550792.novalocal systemd[4299]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Queued start job for default target Main User Target.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Created slice User Application Slice.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Started Daily Cleanup of User's Temporary Directories.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Reached target Paths.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Reached target Timers.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Starting D-Bus User Message Bus Socket...
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Starting Create User's Volatile Files and Directories...
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Finished Create User's Volatile Files and Directories.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Listening on D-Bus User Message Bus Socket.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Reached target Sockets.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Reached target Basic System.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Reached target Main User Target.
Dec 08 19:18:15 np0005550792.novalocal systemd[4299]: Startup finished in 138ms.
Dec 08 19:18:15 np0005550792.novalocal systemd[1]: Started User Manager for UID 1000.
Dec 08 19:18:15 np0005550792.novalocal systemd[1]: Started Session 1 of User zuul.
Dec 08 19:18:15 np0005550792.novalocal sshd-session[4295]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:18:15 np0005550792.novalocal python3[4381]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:18:18 np0005550792.novalocal python3[4409]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:18:19 np0005550792.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 08 19:18:24 np0005550792.novalocal python3[4469]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:18:25 np0005550792.novalocal python3[4509]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 08 19:18:27 np0005550792.novalocal python3[4535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDdR5iPEnFy7Ct2ji7LyIw+JF4yTJfEi8xzyBIN3VFW6+kpqZMG9gfHEBVB6lLOY9BCny3XjWHGMKhxAR9/Bx7a1Vph61D/1VdSYwnL58SFwqvi9VqU7X7+xEcswzvY6j+152R6XjQDDya8h500TTil/11eVGNZyrySXoZVVIISOC5XwkfDYafCbNiaKgKCovPZWBCzNzAnut7ky4Ffqr7+nDrz5uytOleHhJpLcijC86TSEzK77Kuq+/WvbkQfgUbMGJr6fx6t8snkF0/vU4yMX0bV/eIn+TrWt/OQxg38JsgLUjJF3koLGCYs1cgVQVJThoucvruLSJDI5wyDTBVkdKkIRNjK5AZN1dKqQ1Vr+DtG4s9aNwtfhJLyyczImlEk7hoZbrry+P+cgMk1Lq/QHg058eaekgVSss2w5i20mX088iSubIJ5g6seZCDCe03mvDzVZwhuxbDZKQcgkiwF4wtHA8jtfPBoEbpT8wIGqqhtTEfrV6ldTt8G0GTCnIE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:27 np0005550792.novalocal python3[4559]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:28 np0005550792.novalocal python3[4658]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:18:28 np0005550792.novalocal python3[4729]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765221507.7891748-207-116902732012655/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=ec250d31b8e14292bd6669edca3e6a83_id_rsa follow=False checksum=78bf8abc018c313916ccb8736e3c5486ec29c879 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:28 np0005550792.novalocal python3[4852]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:18:29 np0005550792.novalocal python3[4923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765221508.6909473-240-136331213483930/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=ec250d31b8e14292bd6669edca3e6a83_id_rsa.pub follow=False checksum=2c2d371927d5c6ebb6fe7c1279899349f2d61622 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:30 np0005550792.novalocal python3[4971]: ansible-ping Invoked with data=pong
Dec 08 19:18:31 np0005550792.novalocal python3[4995]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:18:34 np0005550792.novalocal python3[5053]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 08 19:18:35 np0005550792.novalocal python3[5085]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:35 np0005550792.novalocal python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:35 np0005550792.novalocal python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:36 np0005550792.novalocal python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:36 np0005550792.novalocal python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:36 np0005550792.novalocal python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:37 np0005550792.novalocal sudo[5229]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsibexgcsjwkfcnrrcbblxpwudhndark ; /usr/bin/python3'
Dec 08 19:18:37 np0005550792.novalocal sudo[5229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:38 np0005550792.novalocal python3[5231]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:38 np0005550792.novalocal sudo[5229]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:38 np0005550792.novalocal sudo[5307]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hzryduqqiewpfxtbtlzvedxfaztesewx ; /usr/bin/python3'
Dec 08 19:18:38 np0005550792.novalocal sudo[5307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:38 np0005550792.novalocal python3[5309]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:18:38 np0005550792.novalocal sudo[5307]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:38 np0005550792.novalocal sudo[5380]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxzfdwrllhxlkfctedvzdjnsawwdgnin ; /usr/bin/python3'
Dec 08 19:18:38 np0005550792.novalocal sudo[5380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:39 np0005550792.novalocal python3[5382]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765221518.2542462-21-267821217474409/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:39 np0005550792.novalocal sudo[5380]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:39 np0005550792.novalocal python3[5430]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:40 np0005550792.novalocal python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:40 np0005550792.novalocal python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:40 np0005550792.novalocal python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:40 np0005550792.novalocal python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:41 np0005550792.novalocal python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:41 np0005550792.novalocal python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:41 np0005550792.novalocal python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:42 np0005550792.novalocal python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:42 np0005550792.novalocal python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:42 np0005550792.novalocal python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:42 np0005550792.novalocal python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:43 np0005550792.novalocal python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:43 np0005550792.novalocal python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:43 np0005550792.novalocal python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:43 np0005550792.novalocal python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:44 np0005550792.novalocal python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:44 np0005550792.novalocal python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:44 np0005550792.novalocal python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:45 np0005550792.novalocal python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:45 np0005550792.novalocal python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:45 np0005550792.novalocal python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:45 np0005550792.novalocal python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:46 np0005550792.novalocal python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:46 np0005550792.novalocal python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:46 np0005550792.novalocal python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:18:49 np0005550792.novalocal sudo[6054]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkhlfeujqrncqhraetlxninqmixbpnde ; /usr/bin/python3'
Dec 08 19:18:49 np0005550792.novalocal sudo[6054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:49 np0005550792.novalocal python3[6056]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 08 19:18:49 np0005550792.novalocal systemd[1]: Starting Time & Date Service...
Dec 08 19:18:49 np0005550792.novalocal systemd[1]: Started Time & Date Service.
Dec 08 19:18:49 np0005550792.novalocal systemd-timedated[6058]: Changed time zone to 'UTC' (UTC).
Dec 08 19:18:49 np0005550792.novalocal sudo[6054]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:50 np0005550792.novalocal sudo[6085]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pridtjyovqhmdyktyxmdygranvfueyic ; /usr/bin/python3'
Dec 08 19:18:50 np0005550792.novalocal sudo[6085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:50 np0005550792.novalocal python3[6087]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:50 np0005550792.novalocal sudo[6085]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:51 np0005550792.novalocal python3[6163]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:18:51 np0005550792.novalocal python3[6234]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765221531.126863-153-6196052173709/source _original_basename=tmpe__9j523 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:52 np0005550792.novalocal python3[6334]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:18:52 np0005550792.novalocal python3[6405]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765221531.9299748-183-182147147466618/source _original_basename=tmp4ruuei1_ follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:53 np0005550792.novalocal sudo[6505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmsnhskozxpiaaeycabhnodhaumuxjgs ; /usr/bin/python3'
Dec 08 19:18:53 np0005550792.novalocal sudo[6505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:53 np0005550792.novalocal python3[6507]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:18:53 np0005550792.novalocal sudo[6505]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:53 np0005550792.novalocal sudo[6578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igdftogwtiganqzfcoaqpobpabdvkhdk ; /usr/bin/python3'
Dec 08 19:18:53 np0005550792.novalocal sudo[6578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:53 np0005550792.novalocal python3[6580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765221532.9955275-231-221378970603468/source _original_basename=tmpat8gdv8e follow=False checksum=a972263c83cf44cabc7754859f6611771d0cc68d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:53 np0005550792.novalocal sudo[6578]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:54 np0005550792.novalocal python3[6628]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:18:54 np0005550792.novalocal python3[6654]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:18:54 np0005550792.novalocal sudo[6732]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txsakmksfarajqziosnaydoflrfilyub ; /usr/bin/python3'
Dec 08 19:18:54 np0005550792.novalocal sudo[6732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:54 np0005550792.novalocal python3[6734]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:18:54 np0005550792.novalocal sudo[6732]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:55 np0005550792.novalocal sudo[6805]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjqunklusqvuarsgtdshcammfyyppbiz ; /usr/bin/python3'
Dec 08 19:18:55 np0005550792.novalocal sudo[6805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:55 np0005550792.novalocal python3[6807]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765221534.6217203-273-179576489379768/source _original_basename=tmplqx0emuc follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:18:55 np0005550792.novalocal sudo[6805]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:55 np0005550792.novalocal sudo[6856]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnslkwwpphfhsuxfimjssiqrsptzwzia ; /usr/bin/python3'
Dec 08 19:18:55 np0005550792.novalocal sudo[6856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:18:55 np0005550792.novalocal python3[6858]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-6539-5039-00000000001d-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:18:55 np0005550792.novalocal sudo[6856]: pam_unix(sudo:session): session closed for user root
Dec 08 19:18:56 np0005550792.novalocal python3[6886]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env _uses_shell=True zuul_log_id=fa163ef9-e89a-6539-5039-00000000001e-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 08 19:18:57 np0005550792.novalocal python3[6915]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:19:16 np0005550792.novalocal sudo[6939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koznyblpaasjzlerqvsqwvlnirksxzjr ; /usr/bin/python3'
Dec 08 19:19:16 np0005550792.novalocal sudo[6939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:19:16 np0005550792.novalocal python3[6941]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:19:16 np0005550792.novalocal sudo[6939]: pam_unix(sudo:session): session closed for user root
Dec 08 19:19:19 np0005550792.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 08 19:19:59 np0005550792.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 08 19:19:59 np0005550792.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 08 19:19:59 np0005550792.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 08 19:19:59 np0005550792.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 08 19:19:59 np0005550792.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 08 19:19:59 np0005550792.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 08 19:19:59 np0005550792.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 08 19:19:59 np0005550792.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 08 19:19:59 np0005550792.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 08 19:19:59 np0005550792.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1276] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 08 19:19:59 np0005550792.novalocal systemd-udevd[6945]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1477] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1508] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1513] device (eth1): carrier: link connected
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1515] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1521] policy: auto-activating connection 'Wired connection 1' (ce215099-8d27-3935-a760-83a6a9e7b4be)
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1525] device (eth1): Activation: starting connection 'Wired connection 1' (ce215099-8d27-3935-a760-83a6a9e7b4be)
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1526] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1530] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1534] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:19:59 np0005550792.novalocal NetworkManager[860]: <info>  [1765221599.1538] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 08 19:19:59 np0005550792.novalocal python3[6971]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-878d-c2cd-0000000000fc-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:20:06 np0005550792.novalocal sudo[7049]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knhblarlzaphjclqmfxkrgkwtcjunmjs ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 08 19:20:06 np0005550792.novalocal sudo[7049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:20:06 np0005550792.novalocal python3[7051]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:20:06 np0005550792.novalocal sudo[7049]: pam_unix(sudo:session): session closed for user root
Dec 08 19:20:07 np0005550792.novalocal sudo[7122]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgztsdojjavobxiaekbtylujirixhuyf ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 08 19:20:07 np0005550792.novalocal sudo[7122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:20:07 np0005550792.novalocal python3[7124]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765221606.5204651-102-163222468458258/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=85edfdfea8c68ea2f6339f8f1e720fb20592a92c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:20:07 np0005550792.novalocal sudo[7122]: pam_unix(sudo:session): session closed for user root
Dec 08 19:20:07 np0005550792.novalocal sudo[7172]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjitaokvvhlzqhuojinpbxycbkimddrj ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 08 19:20:07 np0005550792.novalocal sudo[7172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:20:08 np0005550792.novalocal python3[7174]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Stopped Network Manager Wait Online.
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Stopping Network Manager Wait Online...
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[860]: <info>  [1765221608.1232] caught SIGTERM, shutting down normally.
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Stopping Network Manager...
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[860]: <info>  [1765221608.1241] dhcp4 (eth0): canceled DHCP transaction
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[860]: <info>  [1765221608.1241] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[860]: <info>  [1765221608.1241] dhcp4 (eth0): state changed no lease
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[860]: <info>  [1765221608.1244] manager: NetworkManager state is now CONNECTING
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[860]: <info>  [1765221608.1329] dhcp4 (eth1): canceled DHCP transaction
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[860]: <info>  [1765221608.1329] dhcp4 (eth1): state changed no lease
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[860]: <info>  [1765221608.1381] exiting (success)
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Stopped Network Manager.
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: NetworkManager.service: Consumed 1.033s CPU time, 10.0M memory peak.
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Starting Network Manager...
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.1875] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:dbd0b2df-41a2-4b72-b337-1b1fd8346088)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.1878] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.1937] manager[0x564528ea1070]: monitoring kernel firmware directory '/lib/firmware'.
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Starting Hostname Service...
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Started Hostname Service.
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2848] hostname: hostname: using hostnamed
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2849] hostname: static hostname changed from (none) to "np0005550792.novalocal"
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2859] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2866] manager[0x564528ea1070]: rfkill: Wi-Fi hardware radio set enabled
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2866] manager[0x564528ea1070]: rfkill: WWAN hardware radio set enabled
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2901] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2901] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2902] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2903] manager: Networking is enabled by state file
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2906] settings: Loaded settings plugin: keyfile (internal)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2910] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2938] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2950] dhcp: init: Using DHCP client 'internal'
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2954] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2960] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2966] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2975] device (lo): Activation: starting connection 'lo' (da4f7c5b-a714-4e67-a816-06de13118f80)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2984] device (eth0): carrier: link connected
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2992] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2998] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.2999] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3007] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3016] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3024] device (eth1): carrier: link connected
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3029] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3035] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (ce215099-8d27-3935-a760-83a6a9e7b4be) (indicated)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3036] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3043] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3051] device (eth1): Activation: starting connection 'Wired connection 1' (ce215099-8d27-3935-a760-83a6a9e7b4be)
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Started Network Manager.
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3089] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3100] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3104] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3108] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3112] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3117] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3122] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3127] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3131] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3140] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3143] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3151] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3154] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3166] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3171] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3176] device (lo): Activation: successful, device activated.
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3182] dhcp4 (eth0): state changed new lease, address=38.102.83.66
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3189] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3246] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal systemd[1]: Starting Network Manager Wait Online...
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3270] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3271] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3274] manager: NetworkManager state is now CONNECTED_SITE
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3277] device (eth0): Activation: successful, device activated.
Dec 08 19:20:08 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221608.3282] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 08 19:20:08 np0005550792.novalocal sudo[7172]: pam_unix(sudo:session): session closed for user root
Dec 08 19:20:08 np0005550792.novalocal python3[7258]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-878d-c2cd-0000000000a7-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
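At this point eth0 holds a DHCP lease (38.102.83.66) and has been made the IPv4 default for routing and DNS, and the zuul job immediately queries the routing table with "ip route". A minimal sketch of equivalent manual checks on the node (device name eth0 taken from the log):

    # show the default route and the address NetworkManager obtained on eth0
    ip route show default
    ip -4 addr show dev eth0
    # NetworkManager's view of the same device
    nmcli -f IP4.ADDRESS,IP4.GATEWAY device show eth0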
Dec 08 19:20:18 np0005550792.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 08 19:20:38 np0005550792.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3366] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 08 19:20:53 np0005550792.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 08 19:20:53 np0005550792.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3645] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3648] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3667] device (eth1): Activation: successful, device activated.
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3676] manager: startup complete
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3683] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <warn>  [1765221653.3695] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3717] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 08 19:20:53 np0005550792.novalocal systemd[1]: Finished Network Manager Wait Online.
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3850] dhcp4 (eth1): canceled DHCP transaction
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3851] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3851] dhcp4 (eth1): state changed no lease
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3876] policy: auto-activating connection 'ci-private-network' (707dcd72-95a2-510c-83a0-f85b3c0b91a1)
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3883] device (eth1): Activation: starting connection 'ci-private-network' (707dcd72-95a2-510c-83a0-f85b3c0b91a1)
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3884] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3888] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3898] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3911] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3957] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3960] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:20:53 np0005550792.novalocal NetworkManager[7186]: <info>  [1765221653.3966] device (eth1): Activation: successful, device activated.
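The sequence above shows eth1's assumed profile 'Wired connection 1' failing on DHCP (state change to failed, reason 'ip-config-unavailable') and NetworkManager then auto-activating the 'ci-private-network' profile instead. A sketch of how that fallback could be inspected on the node (profile and device names taken from the log):

    # which profile each device ended up on
    nmcli -f NAME,UUID,DEVICE connection show
    # whether the fallback profile is static or DHCP
    nmcli -f ipv4.method,ipv4.addresses connection show ci-private-network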
Dec 08 19:21:03 np0005550792.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 08 19:21:04 np0005550792.novalocal sudo[7361]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-coppnffqqoqvkdeauiuegwspezbsgfnx ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 08 19:21:04 np0005550792.novalocal sudo[7361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:21:04 np0005550792.novalocal python3[7363]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:21:04 np0005550792.novalocal sudo[7361]: pam_unix(sudo:session): session closed for user root
Dec 08 19:21:05 np0005550792.novalocal sudo[7434]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ribdhgvzvdzpdslopggxxvwaazfwvlsp ; OS_CLOUD=vexxhost /usr/bin/python3'
Dec 08 19:21:05 np0005550792.novalocal sudo[7434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:21:05 np0005550792.novalocal python3[7436]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765221664.5754204-259-136869599457084/source _original_basename=tmp5ireoqi8 follow=False checksum=fef23bec704a30a49cee05d2b4917cdce9b70ddc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:21:05 np0005550792.novalocal sudo[7434]: pam_unix(sudo:session): session closed for user root
Dec 08 19:21:09 np0005550792.novalocal systemd[4299]: Starting Mark boot as successful...
Dec 08 19:21:09 np0005550792.novalocal systemd[4299]: Finished Mark boot as successful.
Dec 08 19:22:05 np0005550792.novalocal sshd-session[4308]: Received disconnect from 38.102.83.114 port 54344:11: disconnected by user
Dec 08 19:22:05 np0005550792.novalocal sshd-session[4308]: Disconnected from user zuul 38.102.83.114 port 54344
Dec 08 19:22:05 np0005550792.novalocal sshd-session[4295]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:22:05 np0005550792.novalocal systemd-logind[793]: Session 1 logged out. Waiting for processes to exit.
Dec 08 19:23:36 np0005550792.novalocal sshd-session[7463]: Received disconnect from 193.46.255.7 port 53642:11:  [preauth]
Dec 08 19:23:36 np0005550792.novalocal sshd-session[7463]: Disconnected from authenticating user root 193.46.255.7 port 53642 [preauth]
Dec 08 19:24:09 np0005550792.novalocal systemd[4299]: Created slice User Background Tasks Slice.
Dec 08 19:24:09 np0005550792.novalocal systemd[4299]: Starting Cleanup of User's Temporary Files and Directories...
Dec 08 19:24:09 np0005550792.novalocal systemd[4299]: Finished Cleanup of User's Temporary Files and Directories.
Dec 08 19:24:28 np0005550792.novalocal sshd-session[7468]: Invalid user soporte from 172.190.42.55 port 55590
Dec 08 19:24:28 np0005550792.novalocal sshd-session[7468]: Received disconnect from 172.190.42.55 port 55590:11: Bye Bye [preauth]
Dec 08 19:24:28 np0005550792.novalocal sshd-session[7468]: Disconnected from invalid user soporte 172.190.42.55 port 55590 [preauth]
Dec 08 19:25:02 np0005550792.novalocal sshd-session[7471]: Invalid user httpd from 45.78.228.32 port 34812
Dec 08 19:25:02 np0005550792.novalocal sshd-session[7471]: Received disconnect from 45.78.228.32 port 34812:11: Bye Bye [preauth]
Dec 08 19:25:02 np0005550792.novalocal sshd-session[7471]: Disconnected from invalid user httpd 45.78.228.32 port 34812 [preauth]
Dec 08 19:25:21 np0005550792.novalocal sshd-session[7473]: Received disconnect from 159.223.8.81 port 39708:11: Bye Bye [preauth]
Dec 08 19:25:21 np0005550792.novalocal sshd-session[7473]: Disconnected from authenticating user root 159.223.8.81 port 39708 [preauth]
Dec 08 19:26:41 np0005550792.novalocal sshd-session[7477]: Accepted publickey for zuul from 38.102.83.114 port 58390 ssh2: RSA SHA256:bK2Mc9f2rA3L0c6VdnlzeF3oH14XpBgPr+Tkg7h4pNY
Dec 08 19:26:41 np0005550792.novalocal systemd-logind[793]: New session 3 of user zuul.
Dec 08 19:26:41 np0005550792.novalocal systemd[1]: Started Session 3 of User zuul.
Dec 08 19:26:41 np0005550792.novalocal sshd-session[7477]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:26:41 np0005550792.novalocal sudo[7504]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brztewmfwaydqqkbbpozhkjsdmvwyzzj ; /usr/bin/python3'
Dec 08 19:26:41 np0005550792.novalocal sudo[7504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:42 np0005550792.novalocal python3[7506]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-1416-796b-000000001eff-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:26:42 np0005550792.novalocal sudo[7504]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:42 np0005550792.novalocal sudo[7532]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcmypuhpywtxevrxlfvlukznphhzlydm ; /usr/bin/python3'
Dec 08 19:26:42 np0005550792.novalocal sudo[7532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:42 np0005550792.novalocal python3[7534]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:26:42 np0005550792.novalocal sudo[7532]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:42 np0005550792.novalocal sudo[7558]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vytffvihyzkncukhqitgjfvoblcsiaet ; /usr/bin/python3'
Dec 08 19:26:42 np0005550792.novalocal sudo[7558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:42 np0005550792.novalocal python3[7560]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:26:42 np0005550792.novalocal sudo[7558]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:42 np0005550792.novalocal sudo[7585]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkblvvjlerducskdzigkizgtqizwfdol ; /usr/bin/python3'
Dec 08 19:26:42 np0005550792.novalocal sudo[7585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:43 np0005550792.novalocal python3[7587]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:26:43 np0005550792.novalocal sudo[7585]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:43 np0005550792.novalocal sudo[7611]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeygfudsxhdesywqbuoauamxzoouxifg ; /usr/bin/python3'
Dec 08 19:26:43 np0005550792.novalocal sudo[7611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:43 np0005550792.novalocal python3[7613]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:26:43 np0005550792.novalocal sudo[7611]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:43 np0005550792.novalocal sudo[7637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdwszblwppktwmveedkaqtpsoxafcdmx ; /usr/bin/python3'
Dec 08 19:26:43 np0005550792.novalocal sudo[7637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:43 np0005550792.novalocal python3[7639]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:26:43 np0005550792.novalocal sudo[7637]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:44 np0005550792.novalocal sudo[7715]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsofgmjosmdwrqlpiyyuyytaxjojbwtn ; /usr/bin/python3'
Dec 08 19:26:44 np0005550792.novalocal sudo[7715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:44 np0005550792.novalocal python3[7717]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:26:44 np0005550792.novalocal sudo[7715]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:44 np0005550792.novalocal sudo[7788]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrxjujkcgceofhthbphisinjcqwxzgmo ; /usr/bin/python3'
Dec 08 19:26:44 np0005550792.novalocal sudo[7788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:44 np0005550792.novalocal python3[7790]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765222003.8799343-476-278549650269952/source _original_basename=tmp0mlf3v6x follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:26:44 np0005550792.novalocal sudo[7788]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:45 np0005550792.novalocal sudo[7838]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqnjjieopmgebynytbjcudsgcvcezleh ; /usr/bin/python3'
Dec 08 19:26:45 np0005550792.novalocal sudo[7838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:45 np0005550792.novalocal python3[7840]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 19:26:45 np0005550792.novalocal systemd[1]: Reloading.
Dec 08 19:26:45 np0005550792.novalocal systemd-rc-local-generator[7859]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:26:45 np0005550792.novalocal sudo[7838]: pam_unix(sudo:session): session closed for user root
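The job has just copied a drop-in to /etc/systemd/system.conf.d/override.conf (its contents are not recorded in the log) and then runs the systemd_service module with daemon_reload=True, which is what produces the 'Reloading.' line above. A rough manual equivalent, assuming nothing about the drop-in's contents:

    # list manager-level drop-ins and ask systemd to re-read its configuration
    ls /etc/systemd/system.conf.d/
    systemctl daemon-reload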
Dec 08 19:26:47 np0005550792.novalocal sudo[7894]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxiugjtkcsvroifnxkqrfzigvmjdyrpr ; /usr/bin/python3'
Dec 08 19:26:47 np0005550792.novalocal sudo[7894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:47 np0005550792.novalocal python3[7896]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 08 19:26:47 np0005550792.novalocal sudo[7894]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:47 np0005550792.novalocal sudo[7920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhcsfljnykcxvlupxdklnuxenctkovdw ; /usr/bin/python3'
Dec 08 19:26:47 np0005550792.novalocal sudo[7920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:47 np0005550792.novalocal python3[7922]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:26:47 np0005550792.novalocal sudo[7920]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:47 np0005550792.novalocal sudo[7948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhyjfvpckntymrksbtxwlllzuhhfynwf ; /usr/bin/python3'
Dec 08 19:26:47 np0005550792.novalocal sudo[7948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:47 np0005550792.novalocal python3[7950]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:26:47 np0005550792.novalocal sudo[7948]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:48 np0005550792.novalocal sudo[7976]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nduyppodnonjsmqlwqmwkwgfbwljqivx ; /usr/bin/python3'
Dec 08 19:26:48 np0005550792.novalocal sudo[7976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:48 np0005550792.novalocal python3[7978]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:26:48 np0005550792.novalocal sudo[7976]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:48 np0005550792.novalocal sudo[8004]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwnqhgmhjsfowzpttvirdexshxrihhrq ; /usr/bin/python3'
Dec 08 19:26:48 np0005550792.novalocal sudo[8004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:48 np0005550792.novalocal python3[8006]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:26:48 np0005550792.novalocal sudo[8004]: pam_unix(sudo:session): session closed for user root
Dec 08 19:26:49 np0005550792.novalocal python3[8033]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-1416-796b-000000001f06-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:26:49 np0005550792.novalocal python3[8063]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
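The echo commands above write I/O limits through the cgroup v2 io controller: each io.max file takes one "MAJ:MIN key=value ..." line per block device, and 252:0 matches the MAJ:MIN of /dev/vda that the earlier lsblk call looked up. A condensed sketch of the same steps for a single slice (values copied from the job):

    # device numbers for the root disk
    lsblk -nd -o MAJ:MIN /dev/vda            # reported as 252:0 on this node
    # cap IOPS and bandwidth for everything under system.slice, then read it back
    echo "252:0 riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
    cat /sys/fs/cgroup/system.slice/io.max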
Dec 08 19:26:51 np0005550792.novalocal sshd-session[7480]: Connection closed by 38.102.83.114 port 58390
Dec 08 19:26:51 np0005550792.novalocal sshd-session[7477]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:26:51 np0005550792.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Dec 08 19:26:51 np0005550792.novalocal systemd[1]: session-3.scope: Consumed 4.300s CPU time.
Dec 08 19:26:51 np0005550792.novalocal systemd-logind[793]: Session 3 logged out. Waiting for processes to exit.
Dec 08 19:26:51 np0005550792.novalocal systemd-logind[793]: Removed session 3.
Dec 08 19:26:53 np0005550792.novalocal sshd-session[8068]: Accepted publickey for zuul from 38.102.83.114 port 35934 ssh2: RSA SHA256:bK2Mc9f2rA3L0c6VdnlzeF3oH14XpBgPr+Tkg7h4pNY
Dec 08 19:26:53 np0005550792.novalocal systemd-logind[793]: New session 4 of user zuul.
Dec 08 19:26:53 np0005550792.novalocal systemd[1]: Started Session 4 of User zuul.
Dec 08 19:26:53 np0005550792.novalocal sshd-session[8068]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:26:53 np0005550792.novalocal sudo[8095]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzuhzzummdgmfiqjezchmaczutjaoros ; /usr/bin/python3'
Dec 08 19:26:53 np0005550792.novalocal sudo[8095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:26:53 np0005550792.novalocal python3[8097]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
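The dnf module call above starts the container-tooling install; the SELinux policy reloads and setsebool messages that follow are produced while that transaction runs. The manual equivalent of the task is simply:

    dnf install -y podman buildah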
Dec 08 19:27:07 np0005550792.novalocal sshd-session[8159]: Received disconnect from 192.76.153.91 port 58392:11: Bye Bye [preauth]
Dec 08 19:27:07 np0005550792.novalocal sshd-session[8159]: Disconnected from authenticating user root 192.76.153.91 port 58392 [preauth]
Dec 08 19:27:11 np0005550792.novalocal sshd-session[8178]: Received disconnect from 101.47.160.247 port 45408:11: Bye Bye [preauth]
Dec 08 19:27:11 np0005550792.novalocal sshd-session[8178]: Disconnected from authenticating user root 101.47.160.247 port 45408 [preauth]
Dec 08 19:27:24 np0005550792.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 08 19:27:24 np0005550792.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 19:27:24 np0005550792.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 08 19:27:24 np0005550792.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 19:27:24 np0005550792.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 08 19:27:24 np0005550792.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 19:27:24 np0005550792.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 19:27:24 np0005550792.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 19:27:33 np0005550792.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 08 19:27:33 np0005550792.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 19:27:33 np0005550792.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 08 19:27:33 np0005550792.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 19:27:33 np0005550792.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 08 19:27:33 np0005550792.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 19:27:33 np0005550792.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 19:27:33 np0005550792.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 19:27:42 np0005550792.novalocal kernel: SELinux:  Converting 385 SID table entries...
Dec 08 19:27:42 np0005550792.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 19:27:42 np0005550792.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 08 19:27:42 np0005550792.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 19:27:42 np0005550792.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 08 19:27:42 np0005550792.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 19:27:42 np0005550792.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 19:27:42 np0005550792.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 19:27:43 np0005550792.novalocal setsebool[8239]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 08 19:27:43 np0005550792.novalocal setsebool[8239]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
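Two SELinux booleans, virt_use_nfs and virt_sandbox_use_all_caps, are switched on by root during the install. A sketch of how to confirm them afterwards (the log does not say whether the change was made persistent with -P):

    # current values of the two booleans
    getsebool virt_use_nfs virt_sandbox_use_all_caps
    # persist them across reboots if needed
    setsebool -P virt_use_nfs 1 virt_sandbox_use_all_caps 1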
Dec 08 19:27:54 np0005550792.novalocal kernel: SELinux:  Converting 388 SID table entries...
Dec 08 19:27:54 np0005550792.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 19:27:54 np0005550792.novalocal kernel: SELinux:  policy capability open_perms=1
Dec 08 19:27:54 np0005550792.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 19:27:54 np0005550792.novalocal kernel: SELinux:  policy capability always_check_network=0
Dec 08 19:27:54 np0005550792.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 19:27:54 np0005550792.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 19:27:54 np0005550792.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 19:27:56 np0005550792.novalocal sshd-session[8953]: Invalid user ubuntu from 172.190.42.55 port 36178
Dec 08 19:27:57 np0005550792.novalocal sshd-session[8953]: Received disconnect from 172.190.42.55 port 36178:11: Bye Bye [preauth]
Dec 08 19:27:57 np0005550792.novalocal sshd-session[8953]: Disconnected from invalid user ubuntu 172.190.42.55 port 36178 [preauth]
Dec 08 19:28:11 np0005550792.novalocal dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 08 19:28:11 np0005550792.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 08 19:28:11 np0005550792.novalocal systemd[1]: Starting man-db-cache-update.service...
Dec 08 19:28:11 np0005550792.novalocal systemd[1]: Reloading.
Dec 08 19:28:12 np0005550792.novalocal systemd-rc-local-generator[8997]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:28:12 np0005550792.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Dec 08 19:28:13 np0005550792.novalocal sudo[8095]: pam_unix(sudo:session): session closed for user root
Dec 08 19:28:19 np0005550792.novalocal irqbalance[785]: Cannot change IRQ 27 affinity: Operation not permitted
Dec 08 19:28:19 np0005550792.novalocal irqbalance[785]: IRQ 27 affinity is now unmanaged
Dec 08 19:28:24 np0005550792.novalocal sshd[1006]: Timeout before authentication for connection from 14.103.76.234 to 38.102.83.66, pid = 7475
Dec 08 19:28:28 np0005550792.novalocal python3[18040]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-d1da-5aca-00000000000a-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:28:29 np0005550792.novalocal kernel: evm: overlay not supported
Dec 08 19:28:29 np0005550792.novalocal systemd[4299]: Starting D-Bus User Message Bus...
Dec 08 19:28:29 np0005550792.novalocal dbus-broker-launch[18505]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 08 19:28:29 np0005550792.novalocal dbus-broker-launch[18505]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 08 19:28:29 np0005550792.novalocal systemd[4299]: Started D-Bus User Message Bus.
Dec 08 19:28:29 np0005550792.novalocal dbus-broker-lau[18505]: Ready
Dec 08 19:28:29 np0005550792.novalocal systemd[4299]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 08 19:28:29 np0005550792.novalocal systemd[4299]: Created slice Slice /user.
Dec 08 19:28:29 np0005550792.novalocal systemd[4299]: podman-18450.scope: unit configures an IP firewall, but not running as root.
Dec 08 19:28:29 np0005550792.novalocal systemd[4299]: (This warning is only shown for the first unit using IP firewalling.)
Dec 08 19:28:29 np0005550792.novalocal systemd[4299]: Started podman-18450.scope.
Dec 08 19:28:29 np0005550792.novalocal systemd[4299]: Started podman-pause-5872d093.scope.
Dec 08 19:28:30 np0005550792.novalocal sudo[18625]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uelhftaoifhwremhrlygfzkkhxnyxqkv ; /usr/bin/python3'
Dec 08 19:28:30 np0005550792.novalocal sudo[18625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:28:30 np0005550792.novalocal python3[18634]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                       location = "38.102.83.51:5001"
                                                       insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                       location = "38.102.83.51:5001"
                                                       insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:28:30 np0005550792.novalocal python3[18634]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 08 19:28:30 np0005550792.novalocal sudo[18625]: pam_unix(sudo:session): session closed for user root
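The blockinfile task above appends an insecure-registry entry for 38.102.83.51:5001 to /etc/containers/registries.conf between ANSIBLE MANAGED BLOCK markers. Based on the parameters in the log, the tail of that file should now read roughly:

    tail -n 5 /etc/containers/registries.conf
    # expected:
    #   # BEGIN ANSIBLE MANAGED BLOCK
    #   [[registry]]
    #   location = "38.102.83.51:5001"
    #   insecure = true
    #   # END ANSIBLE MANAGED BLOCK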
Dec 08 19:28:30 np0005550792.novalocal sshd-session[8071]: Connection closed by 38.102.83.114 port 35934
Dec 08 19:28:30 np0005550792.novalocal sshd-session[8068]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:28:30 np0005550792.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Dec 08 19:28:30 np0005550792.novalocal systemd[1]: session-4.scope: Consumed 1min 10.043s CPU time.
Dec 08 19:28:30 np0005550792.novalocal systemd-logind[793]: Session 4 logged out. Waiting for processes to exit.
Dec 08 19:28:30 np0005550792.novalocal systemd-logind[793]: Removed session 4.
Dec 08 19:28:38 np0005550792.novalocal sshd-session[21631]: Invalid user mark from 159.223.8.81 port 38280
Dec 08 19:28:38 np0005550792.novalocal sshd-session[21631]: Received disconnect from 159.223.8.81 port 38280:11: Bye Bye [preauth]
Dec 08 19:28:38 np0005550792.novalocal sshd-session[21631]: Disconnected from invalid user mark 159.223.8.81 port 38280 [preauth]
Dec 08 19:28:49 np0005550792.novalocal sshd-session[25710]: Connection closed by 38.102.83.58 port 47202 [preauth]
Dec 08 19:28:49 np0005550792.novalocal sshd-session[25709]: Connection closed by 38.102.83.58 port 47188 [preauth]
Dec 08 19:28:49 np0005550792.novalocal sshd-session[25715]: Unable to negotiate with 38.102.83.58 port 47222: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 08 19:28:49 np0005550792.novalocal sshd-session[25712]: Unable to negotiate with 38.102.83.58 port 47224: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 08 19:28:49 np0005550792.novalocal sshd-session[25713]: Unable to negotiate with 38.102.83.58 port 47206: no matching host key type found. Their offer: ssh-ed25519 [preauth]
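The three 'Unable to negotiate' lines are connections from 38.102.83.58 that each offer a single host key algorithm (sk-ecdsa-sha2-nistp256@openssh.com, sk-ssh-ed25519@openssh.com, ssh-ed25519) for which this sshd has no matching host key. A sketch of how to see what the local sshd can actually offer:

    # host key files configured for the running sshd
    sshd -T 2>/dev/null | grep -i '^hostkey'
    # key types this OpenSSH build supports at all
    ssh -Q key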
Dec 08 19:28:53 np0005550792.novalocal sshd-session[27244]: Accepted publickey for zuul from 38.102.83.114 port 51130 ssh2: RSA SHA256:bK2Mc9f2rA3L0c6VdnlzeF3oH14XpBgPr+Tkg7h4pNY
Dec 08 19:28:53 np0005550792.novalocal systemd-logind[793]: New session 5 of user zuul.
Dec 08 19:28:53 np0005550792.novalocal systemd[1]: Started Session 5 of User zuul.
Dec 08 19:28:54 np0005550792.novalocal sshd-session[27244]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:28:54 np0005550792.novalocal python3[27337]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPPeJ5SrOxHUxKBFDqO3LDUlU8liHkEglhjDLRlQtSMTcSRlZ81tLZUIxhT3LiGjkd3YPnewpMKGPMGtTg+ghFI= zuul@np0005550791.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:28:54 np0005550792.novalocal sudo[27488]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpalyiiijcjaiookancqykdizbzduyfv ; /usr/bin/python3'
Dec 08 19:28:54 np0005550792.novalocal sudo[27488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:28:54 np0005550792.novalocal python3[27501]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPPeJ5SrOxHUxKBFDqO3LDUlU8liHkEglhjDLRlQtSMTcSRlZ81tLZUIxhT3LiGjkd3YPnewpMKGPMGtTg+ghFI= zuul@np0005550791.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:28:54 np0005550792.novalocal sudo[27488]: pam_unix(sudo:session): session closed for user root
Dec 08 19:28:55 np0005550792.novalocal sudo[27842]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhqbkedmburpaqeigpjgqkysphjntzbc ; /usr/bin/python3'
Dec 08 19:28:55 np0005550792.novalocal sudo[27842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:28:55 np0005550792.novalocal python3[27852]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005550792.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 08 19:28:55 np0005550792.novalocal useradd[27914]: new group: name=cloud-admin, GID=1002
Dec 08 19:28:55 np0005550792.novalocal useradd[27914]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Dec 08 19:28:55 np0005550792.novalocal sudo[27842]: pam_unix(sudo:session): session closed for user root
Dec 08 19:28:55 np0005550792.novalocal sudo[28049]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eomvuktfdtlfrriupvftpjcdiejluuii ; /usr/bin/python3'
Dec 08 19:28:55 np0005550792.novalocal sudo[28049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:28:56 np0005550792.novalocal python3[28060]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPPeJ5SrOxHUxKBFDqO3LDUlU8liHkEglhjDLRlQtSMTcSRlZ81tLZUIxhT3LiGjkd3YPnewpMKGPMGtTg+ghFI= zuul@np0005550791.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 08 19:28:56 np0005550792.novalocal sudo[28049]: pam_unix(sudo:session): session closed for user root
Dec 08 19:28:56 np0005550792.novalocal sudo[28309]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrvaopobsiqqlkndvvwrqviwkijqvrno ; /usr/bin/python3'
Dec 08 19:28:56 np0005550792.novalocal sudo[28309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:28:56 np0005550792.novalocal python3[28317]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:28:56 np0005550792.novalocal sudo[28309]: pam_unix(sudo:session): session closed for user root
Dec 08 19:28:56 np0005550792.novalocal sudo[28580]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osflovrpounqgnstmyofuftcgsmdbtmk ; /usr/bin/python3'
Dec 08 19:28:56 np0005550792.novalocal sudo[28580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:28:57 np0005550792.novalocal python3[28587]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765222136.294853-135-35550823083540/source _original_basename=tmp1x8aj5s5 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:28:57 np0005550792.novalocal sudo[28580]: pam_unix(sudo:session): session closed for user root
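The copy above installs /etc/sudoers.d/cloud-admin with mode 0640 for the freshly created cloud-admin user (UID/GID 1002); the file's contents are not recorded in the log. A quick syntax check of the installed drop-in:

    visudo -cf /etc/sudoers.d/cloud-admin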
Dec 08 19:28:57 np0005550792.novalocal sudo[28897]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfytsixpvqunqdcxbliowxtrkcwusfoj ; /usr/bin/python3'
Dec 08 19:28:57 np0005550792.novalocal sudo[28897]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:28:57 np0005550792.novalocal python3[28907]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Dec 08 19:28:57 np0005550792.novalocal systemd[1]: Starting Hostname Service...
Dec 08 19:28:57 np0005550792.novalocal systemd[1]: Started Hostname Service.
Dec 08 19:28:57 np0005550792.novalocal systemd-hostnamed[29017]: Changed pretty hostname to 'compute-0'
Dec 08 19:28:57 compute-0 systemd-hostnamed[29017]: Hostname set to <compute-0> (static)
Dec 08 19:28:57 compute-0 NetworkManager[7186]: <info>  [1765222137.9905] hostname: static hostname changed from "np0005550792.novalocal" to "compute-0"
Dec 08 19:28:58 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 08 19:28:58 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 08 19:28:58 compute-0 sudo[28897]: pam_unix(sudo:session): session closed for user root
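The ansible.builtin.hostname task with use=systemd goes through systemd-hostnamed, which is why both the pretty and static hostname change to compute-0 and NetworkManager immediately picks up the new name. A roughly equivalent manual step:

    hostnamectl set-hostname compute-0
    hostnamectl status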
Dec 08 19:28:58 compute-0 sshd-session[27280]: Connection closed by 38.102.83.114 port 51130
Dec 08 19:28:58 compute-0 sshd-session[27244]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:28:58 compute-0 systemd[1]: session-5.scope: Deactivated successfully.
Dec 08 19:28:58 compute-0 systemd[1]: session-5.scope: Consumed 2.386s CPU time.
Dec 08 19:28:58 compute-0 systemd-logind[793]: Session 5 logged out. Waiting for processes to exit.
Dec 08 19:28:58 compute-0 systemd-logind[793]: Removed session 5.
Dec 08 19:29:00 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 08 19:29:00 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 08 19:29:00 compute-0 systemd[1]: man-db-cache-update.service: Consumed 59.150s CPU time.
Dec 08 19:29:00 compute-0 systemd[1]: run-reae4470c1cf94c92aaacceab86a4d381.service: Deactivated successfully.
Dec 08 19:29:07 compute-0 sshd[1006]: Timeout before authentication for connection from 115.190.25.109 to 38.102.83.66, pid = 8158
Dec 08 19:29:08 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 08 19:29:09 compute-0 sshd-session[30052]: Received disconnect from 172.190.42.55 port 44240:11: Bye Bye [preauth]
Dec 08 19:29:09 compute-0 sshd-session[30052]: Disconnected from authenticating user root 172.190.42.55 port 44240 [preauth]
Dec 08 19:29:28 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 08 19:29:42 compute-0 sshd-session[30059]: Invalid user shabnam from 159.223.8.81 port 46492
Dec 08 19:29:42 compute-0 sshd-session[30059]: Received disconnect from 159.223.8.81 port 46492:11: Bye Bye [preauth]
Dec 08 19:29:42 compute-0 sshd-session[30059]: Disconnected from invalid user shabnam 159.223.8.81 port 46492 [preauth]
Dec 08 19:29:48 compute-0 sshd-session[30061]: Invalid user bob from 45.78.228.32 port 56880
Dec 08 19:29:49 compute-0 sshd-session[30061]: Received disconnect from 45.78.228.32 port 56880:11: Bye Bye [preauth]
Dec 08 19:29:49 compute-0 sshd-session[30061]: Disconnected from invalid user bob 45.78.228.32 port 56880 [preauth]
Dec 08 19:30:04 compute-0 sshd-session[30063]: Received disconnect from 101.47.160.247 port 40820:11: Bye Bye [preauth]
Dec 08 19:30:04 compute-0 sshd-session[30063]: Disconnected from authenticating user root 101.47.160.247 port 40820 [preauth]
Dec 08 19:30:20 compute-0 sshd-session[30065]: Received disconnect from 172.190.42.55 port 52836:11: Bye Bye [preauth]
Dec 08 19:30:20 compute-0 sshd-session[30065]: Disconnected from authenticating user root 172.190.42.55 port 52836 [preauth]
Dec 08 19:30:32 compute-0 sshd-session[30069]: Received disconnect from 222.172.32.246 port 2167:11: Bye Bye [preauth]
Dec 08 19:30:32 compute-0 sshd-session[30069]: Disconnected from authenticating user root 222.172.32.246 port 2167 [preauth]
Dec 08 19:30:44 compute-0 sshd-session[30071]: Received disconnect from 159.223.8.81 port 47524:11: Bye Bye [preauth]
Dec 08 19:30:44 compute-0 sshd-session[30071]: Disconnected from authenticating user root 159.223.8.81 port 47524 [preauth]
Dec 08 19:31:30 compute-0 sshd-session[30073]: Invalid user httpd from 172.190.42.55 port 33238
Dec 08 19:31:30 compute-0 sshd-session[30073]: Received disconnect from 172.190.42.55 port 33238:11: Bye Bye [preauth]
Dec 08 19:31:30 compute-0 sshd-session[30073]: Disconnected from invalid user httpd 172.190.42.55 port 33238 [preauth]
Dec 08 19:31:41 compute-0 sshd-session[30075]: Received disconnect from 159.223.8.81 port 56320:11: Bye Bye [preauth]
Dec 08 19:31:41 compute-0 sshd-session[30075]: Disconnected from authenticating user root 159.223.8.81 port 56320 [preauth]
Dec 08 19:32:27 compute-0 sshd-session[30077]: Received disconnect from 101.47.160.247 port 55914:11: Bye Bye [preauth]
Dec 08 19:32:27 compute-0 sshd-session[30077]: Disconnected from authenticating user root 101.47.160.247 port 55914 [preauth]
Dec 08 19:32:38 compute-0 sshd-session[30079]: Invalid user user21 from 159.223.8.81 port 34906
Dec 08 19:32:38 compute-0 sshd-session[30079]: Received disconnect from 159.223.8.81 port 34906:11: Bye Bye [preauth]
Dec 08 19:32:38 compute-0 sshd-session[30079]: Disconnected from invalid user user21 159.223.8.81 port 34906 [preauth]
Dec 08 19:32:42 compute-0 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 08 19:32:42 compute-0 sshd-session[30081]: Invalid user postgres from 172.190.42.55 port 36994
Dec 08 19:32:42 compute-0 sshd-session[30081]: Received disconnect from 172.190.42.55 port 36994:11: Bye Bye [preauth]
Dec 08 19:32:42 compute-0 sshd-session[30081]: Disconnected from invalid user postgres 172.190.42.55 port 36994 [preauth]
Dec 08 19:32:42 compute-0 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 08 19:32:42 compute-0 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 08 19:32:42 compute-0 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 08 19:33:37 compute-0 sshd-session[30085]: Received disconnect from 159.223.8.81 port 48772:11: Bye Bye [preauth]
Dec 08 19:33:37 compute-0 sshd-session[30085]: Disconnected from authenticating user root 159.223.8.81 port 48772 [preauth]
Dec 08 19:33:49 compute-0 sshd-session[30087]: Accepted publickey for zuul from 38.102.83.58 port 47966 ssh2: RSA SHA256:bK2Mc9f2rA3L0c6VdnlzeF3oH14XpBgPr+Tkg7h4pNY
Dec 08 19:33:49 compute-0 systemd-logind[793]: New session 6 of user zuul.
Dec 08 19:33:49 compute-0 systemd[1]: Started Session 6 of User zuul.
Dec 08 19:33:49 compute-0 sshd-session[30087]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:33:49 compute-0 python3[30163]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:33:51 compute-0 sudo[30277]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbxctvruzecikabpzwfsiokcjfgrmgib ; /usr/bin/python3'
Dec 08 19:33:51 compute-0 sudo[30277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:51 compute-0 python3[30279]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:33:51 compute-0 sudo[30277]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:51 compute-0 sudo[30350]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucaqoemuuziylkrnlnjtawpfgnardtqr ; /usr/bin/python3'
Dec 08 19:33:51 compute-0 sudo[30350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:51 compute-0 python3[30352]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765222431.125575-33728-153939741739936/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:33:51 compute-0 sudo[30350]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:52 compute-0 sudo[30376]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcvcqomqoazoprbspradnuswunexzkbm ; /usr/bin/python3'
Dec 08 19:33:52 compute-0 sudo[30376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:52 compute-0 python3[30378]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:33:52 compute-0 sudo[30376]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:52 compute-0 sudo[30449]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrazourpgpeyknmxjmvxtisucxqvjbsk ; /usr/bin/python3'
Dec 08 19:33:52 compute-0 sudo[30449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:52 compute-0 python3[30451]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765222431.125575-33728-153939741739936/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:33:52 compute-0 sudo[30449]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:52 compute-0 sudo[30475]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnephpspsfqecqpreqoqmtzrthqzkbrn ; /usr/bin/python3'
Dec 08 19:33:52 compute-0 sudo[30475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:52 compute-0 python3[30477]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:33:52 compute-0 sudo[30475]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:53 compute-0 sudo[30548]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckmvrobtxxfnwknjfddprbkbmkxpslgl ; /usr/bin/python3'
Dec 08 19:33:53 compute-0 sudo[30548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:53 compute-0 python3[30550]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765222431.125575-33728-153939741739936/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:33:53 compute-0 sudo[30548]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:53 compute-0 sudo[30574]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agdhvwfspsvzrlwmwufgnzmvoakdqjau ; /usr/bin/python3'
Dec 08 19:33:53 compute-0 sudo[30574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:53 compute-0 python3[30576]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:33:53 compute-0 sudo[30574]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:53 compute-0 sudo[30647]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hecpnekekpqzfubvlwqtwqepcoisxiin ; /usr/bin/python3'
Dec 08 19:33:53 compute-0 sudo[30647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:53 compute-0 python3[30649]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765222431.125575-33728-153939741739936/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:33:53 compute-0 sudo[30647]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:53 compute-0 sudo[30673]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elassfvejnaaiveubuzoegqiixxsesmc ; /usr/bin/python3'
Dec 08 19:33:53 compute-0 sudo[30673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:54 compute-0 python3[30675]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:33:54 compute-0 sudo[30673]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:54 compute-0 sudo[30746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oshnbnmoldlormhywdxljnonltrexnka ; /usr/bin/python3'
Dec 08 19:33:54 compute-0 sudo[30746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:54 compute-0 python3[30748]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765222431.125575-33728-153939741739936/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:33:54 compute-0 sudo[30746]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:54 compute-0 sudo[30772]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxumdrjllgjxypkwsbuslfirtdbfqbgy ; /usr/bin/python3'
Dec 08 19:33:54 compute-0 sudo[30772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:54 compute-0 python3[30774]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:33:54 compute-0 sudo[30772]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:54 compute-0 sudo[30845]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kauuqskilrqfhyaeeipbujkqxouysvvv ; /usr/bin/python3'
Dec 08 19:33:54 compute-0 sudo[30845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:55 compute-0 python3[30847]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765222431.125575-33728-153939741739936/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:33:55 compute-0 sudo[30845]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:55 compute-0 sudo[30871]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dscjrscnjxwbjpyxplkzrmjmahyiipuv ; /usr/bin/python3'
Dec 08 19:33:55 compute-0 sudo[30871]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:55 compute-0 python3[30873]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 08 19:33:55 compute-0 sudo[30871]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:55 compute-0 sudo[30944]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqkjhxjwozgkwglrliowbavedpszehlb ; /usr/bin/python3'
Dec 08 19:33:55 compute-0 sudo[30944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:33:55 compute-0 python3[30946]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765222431.125575-33728-153939741739936/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:33:55 compute-0 sudo[30944]: pam_unix(sudo:session): session closed for user root
Dec 08 19:33:57 compute-0 sshd-session[30973]: Connection closed by 192.168.122.11 port 50534 [preauth]
Dec 08 19:33:57 compute-0 sshd-session[30977]: Connection closed by 192.168.122.11 port 50544 [preauth]
Dec 08 19:33:57 compute-0 sshd-session[30974]: Unable to negotiate with 192.168.122.11 port 50550: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Dec 08 19:33:57 compute-0 sshd-session[30975]: Unable to negotiate with 192.168.122.11 port 50562: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Dec 08 19:33:57 compute-0 sshd-session[30976]: Unable to negotiate with 192.168.122.11 port 50578: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Dec 08 19:33:58 compute-0 sshd-session[30971]: Invalid user socks from 172.190.42.55 port 49812
Dec 08 19:33:58 compute-0 sshd-session[30971]: Received disconnect from 172.190.42.55 port 49812:11: Bye Bye [preauth]
Dec 08 19:33:58 compute-0 sshd-session[30971]: Disconnected from invalid user socks 172.190.42.55 port 49812 [preauth]
Dec 08 19:34:52 compute-0 python3[31009]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:34:53 compute-0 sshd-session[30984]: Invalid user cheeki from 101.47.160.247 port 35672
Dec 08 19:34:56 compute-0 sshd-session[30984]: Received disconnect from 101.47.160.247 port 35672:11: Bye Bye [preauth]
Dec 08 19:34:56 compute-0 sshd-session[30984]: Disconnected from invalid user cheeki 101.47.160.247 port 35672 [preauth]
Dec 08 19:35:14 compute-0 sshd-session[31011]: Invalid user shabnam from 172.190.42.55 port 35434
Dec 08 19:35:14 compute-0 sshd-session[31011]: Received disconnect from 172.190.42.55 port 35434:11: Bye Bye [preauth]
Dec 08 19:35:14 compute-0 sshd-session[31011]: Disconnected from invalid user shabnam 172.190.42.55 port 35434 [preauth]
Dec 08 19:35:21 compute-0 sshd-session[31013]: Received disconnect from 159.223.8.81 port 49494:11: Bye Bye [preauth]
Dec 08 19:35:21 compute-0 sshd-session[31013]: Disconnected from authenticating user root 159.223.8.81 port 49494 [preauth]
Dec 08 19:36:19 compute-0 sshd-session[31016]: Invalid user cheeki from 159.223.8.81 port 34306
Dec 08 19:36:19 compute-0 sshd-session[31016]: Received disconnect from 159.223.8.81 port 34306:11: Bye Bye [preauth]
Dec 08 19:36:19 compute-0 sshd-session[31016]: Disconnected from invalid user cheeki 159.223.8.81 port 34306 [preauth]
Dec 08 19:36:30 compute-0 sshd-session[31018]: Received disconnect from 172.190.42.55 port 59292:11: Bye Bye [preauth]
Dec 08 19:36:30 compute-0 sshd-session[31018]: Disconnected from authenticating user root 172.190.42.55 port 59292 [preauth]
Dec 08 19:36:54 compute-0 sshd-session[31020]: Received disconnect from 45.78.228.32 port 36516:11: Bye Bye [preauth]
Dec 08 19:36:54 compute-0 sshd-session[31020]: Disconnected from authenticating user root 45.78.228.32 port 36516 [preauth]
Dec 08 19:37:16 compute-0 sshd-session[31024]: Received disconnect from 159.223.8.81 port 33006:11: Bye Bye [preauth]
Dec 08 19:37:16 compute-0 sshd-session[31024]: Disconnected from authenticating user root 159.223.8.81 port 33006 [preauth]
Dec 08 19:37:45 compute-0 sshd-session[31026]: Invalid user user21 from 172.190.42.55 port 53236
Dec 08 19:37:45 compute-0 sshd-session[31026]: Received disconnect from 172.190.42.55 port 53236:11: Bye Bye [preauth]
Dec 08 19:37:45 compute-0 sshd-session[31026]: Disconnected from invalid user user21 172.190.42.55 port 53236 [preauth]
Dec 08 19:38:12 compute-0 sshd-session[31028]: Received disconnect from 159.223.8.81 port 57874:11: Bye Bye [preauth]
Dec 08 19:38:12 compute-0 sshd-session[31028]: Disconnected from authenticating user root 159.223.8.81 port 57874 [preauth]
Dec 08 19:38:50 compute-0 sshd-session[31030]: Connection closed by 115.190.24.142 port 53300
Dec 08 19:38:59 compute-0 sshd[1006]: Timeout before authentication for connection from 222.172.32.246 to 38.102.83.66, pid = 31022
Dec 08 19:39:03 compute-0 sshd-session[31032]: Invalid user amin from 172.190.42.55 port 38508
Dec 08 19:39:03 compute-0 sshd-session[31032]: Received disconnect from 172.190.42.55 port 38508:11: Bye Bye [preauth]
Dec 08 19:39:03 compute-0 sshd-session[31032]: Disconnected from invalid user amin 172.190.42.55 port 38508 [preauth]
Dec 08 19:39:10 compute-0 sshd-session[31034]: Received disconnect from 159.223.8.81 port 53420:11: Bye Bye [preauth]
Dec 08 19:39:10 compute-0 sshd-session[31034]: Disconnected from authenticating user root 159.223.8.81 port 53420 [preauth]
Dec 08 19:39:38 compute-0 sshd-session[31036]: Received disconnect from 101.47.160.247 port 48548:11: Bye Bye [preauth]
Dec 08 19:39:38 compute-0 sshd-session[31036]: Disconnected from authenticating user root 101.47.160.247 port 48548 [preauth]
Dec 08 19:39:52 compute-0 sshd-session[30090]: Received disconnect from 38.102.83.58 port 47966:11: disconnected by user
Dec 08 19:39:52 compute-0 sshd-session[30090]: Disconnected from user zuul 38.102.83.58 port 47966
Dec 08 19:39:52 compute-0 sshd-session[30087]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:39:52 compute-0 systemd[1]: session-6.scope: Deactivated successfully.
Dec 08 19:39:52 compute-0 systemd[1]: session-6.scope: Consumed 5.199s CPU time.
Dec 08 19:39:52 compute-0 systemd-logind[793]: Session 6 logged out. Waiting for processes to exit.
Dec 08 19:39:52 compute-0 systemd-logind[793]: Removed session 6.
Dec 08 19:40:10 compute-0 sshd[1006]: drop connection #1 from [222.172.32.246]:2169 on [38.102.83.66]:22 penalty: exceeded LoginGraceTime
Dec 08 19:40:12 compute-0 sshd-session[31038]: Invalid user postgres from 159.223.8.81 port 60618
Dec 08 19:40:12 compute-0 sshd-session[31038]: Received disconnect from 159.223.8.81 port 60618:11: Bye Bye [preauth]
Dec 08 19:40:12 compute-0 sshd-session[31038]: Disconnected from invalid user postgres 159.223.8.81 port 60618 [preauth]
Dec 08 19:40:21 compute-0 sshd-session[31041]: Invalid user maarch from 172.190.42.55 port 39404
Dec 08 19:40:21 compute-0 sshd-session[31041]: Received disconnect from 172.190.42.55 port 39404:11: Bye Bye [preauth]
Dec 08 19:40:21 compute-0 sshd-session[31041]: Disconnected from invalid user maarch 172.190.42.55 port 39404 [preauth]
Dec 08 19:40:53 compute-0 sshd[1006]: Timeout before authentication for connection from 115.190.24.142 to 38.102.83.66, pid = 31031
Dec 08 19:41:17 compute-0 sshd-session[31043]: Received disconnect from 159.223.8.81 port 43900:11: Bye Bye [preauth]
Dec 08 19:41:17 compute-0 sshd-session[31043]: Disconnected from authenticating user root 159.223.8.81 port 43900 [preauth]
Dec 08 19:41:18 compute-0 sshd-session[31045]: Received disconnect from 80.94.93.233 port 64010:11:  [preauth]
Dec 08 19:41:18 compute-0 sshd-session[31045]: Disconnected from authenticating user root 80.94.93.233 port 64010 [preauth]
Dec 08 19:41:40 compute-0 sshd-session[31048]: Received disconnect from 172.190.42.55 port 33132:11: Bye Bye [preauth]
Dec 08 19:41:40 compute-0 sshd-session[31048]: Disconnected from authenticating user root 172.190.42.55 port 33132 [preauth]
Dec 08 19:41:54 compute-0 sshd-session[31050]: Received disconnect from 101.47.160.247 port 37408:11: Bye Bye [preauth]
Dec 08 19:41:54 compute-0 sshd-session[31050]: Disconnected from authenticating user root 101.47.160.247 port 37408 [preauth]
Dec 08 19:42:20 compute-0 sshd-session[31052]: Received disconnect from 159.223.8.81 port 60384:11: Bye Bye [preauth]
Dec 08 19:42:20 compute-0 sshd-session[31052]: Disconnected from authenticating user root 159.223.8.81 port 60384 [preauth]
Dec 08 19:42:56 compute-0 sshd-session[31054]: Received disconnect from 172.190.42.55 port 34578:11: Bye Bye [preauth]
Dec 08 19:42:56 compute-0 sshd-session[31054]: Disconnected from authenticating user root 172.190.42.55 port 34578 [preauth]
Dec 08 19:43:23 compute-0 sshd-session[31056]: Invalid user soporte from 159.223.8.81 port 55042
Dec 08 19:43:23 compute-0 sshd-session[31056]: Received disconnect from 159.223.8.81 port 55042:11: Bye Bye [preauth]
Dec 08 19:43:23 compute-0 sshd-session[31056]: Disconnected from invalid user soporte 159.223.8.81 port 55042 [preauth]
Dec 08 19:43:48 compute-0 sshd-session[31060]: Invalid user cortega from 45.78.228.32 port 53786
Dec 08 19:43:48 compute-0 sshd-session[31060]: Received disconnect from 45.78.228.32 port 53786:11: Bye Bye [preauth]
Dec 08 19:43:48 compute-0 sshd-session[31060]: Disconnected from invalid user cortega 45.78.228.32 port 53786 [preauth]
Dec 08 19:44:11 compute-0 sshd-session[31062]: Invalid user mark from 172.190.42.55 port 40308
Dec 08 19:44:11 compute-0 sshd-session[31062]: Received disconnect from 172.190.42.55 port 40308:11: Bye Bye [preauth]
Dec 08 19:44:11 compute-0 sshd-session[31062]: Disconnected from invalid user mark 172.190.42.55 port 40308 [preauth]
Dec 08 19:44:24 compute-0 sshd-session[31064]: Received disconnect from 159.223.8.81 port 45136:11: Bye Bye [preauth]
Dec 08 19:44:24 compute-0 sshd-session[31064]: Disconnected from authenticating user root 159.223.8.81 port 45136 [preauth]
Dec 08 19:45:23 compute-0 sshd-session[31066]: Invalid user front-user from 159.223.8.81 port 55830
Dec 08 19:45:23 compute-0 sshd-session[31066]: Received disconnect from 159.223.8.81 port 55830:11: Bye Bye [preauth]
Dec 08 19:45:23 compute-0 sshd-session[31066]: Disconnected from invalid user front-user 159.223.8.81 port 55830 [preauth]
Dec 08 19:45:24 compute-0 sshd-session[31068]: Received disconnect from 172.190.42.55 port 33072:11: Bye Bye [preauth]
Dec 08 19:45:24 compute-0 sshd-session[31068]: Disconnected from authenticating user root 172.190.42.55 port 33072 [preauth]
Dec 08 19:45:29 compute-0 sshd[1006]: Timeout before authentication for connection from 222.172.32.246 to 38.102.83.66, pid = 31058
Dec 08 19:46:23 compute-0 sshd-session[31071]: Received disconnect from 159.223.8.81 port 37138:11: Bye Bye [preauth]
Dec 08 19:46:23 compute-0 sshd-session[31071]: Disconnected from authenticating user root 159.223.8.81 port 37138 [preauth]
Dec 08 19:46:40 compute-0 sshd-session[31074]: Invalid user ftpuser from 172.190.42.55 port 40056
Dec 08 19:46:40 compute-0 sshd-session[31074]: Received disconnect from 172.190.42.55 port 40056:11: Bye Bye [preauth]
Dec 08 19:46:40 compute-0 sshd-session[31074]: Disconnected from invalid user ftpuser 172.190.42.55 port 40056 [preauth]
Dec 08 19:46:40 compute-0 sshd[1006]: drop connection #0 from [222.172.32.246]:2171 on [38.102.83.66]:22 penalty: exceeded LoginGraceTime
Dec 08 19:47:25 compute-0 sshd-session[31076]: Invalid user noc from 159.223.8.81 port 53498
Dec 08 19:47:25 compute-0 sshd-session[31076]: Received disconnect from 159.223.8.81 port 53498:11: Bye Bye [preauth]
Dec 08 19:47:25 compute-0 sshd-session[31076]: Disconnected from invalid user noc 159.223.8.81 port 53498 [preauth]
Dec 08 19:47:50 compute-0 sshd-session[31078]: Accepted publickey for zuul from 192.168.122.30 port 41596 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:47:50 compute-0 systemd-logind[793]: New session 7 of user zuul.
Dec 08 19:47:50 compute-0 systemd[1]: Started Session 7 of User zuul.
Dec 08 19:47:50 compute-0 sshd-session[31078]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:47:51 compute-0 python3.9[31231]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:47:52 compute-0 sudo[31410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqdjuflgkhqgeghizopdizasikdbotuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223272.4395773-32-166360405350607/AnsiballZ_command.py'
Dec 08 19:47:52 compute-0 sudo[31410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:47:53 compute-0 python3.9[31412]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:47:57 compute-0 sshd-session[31436]: Invalid user ubuntu from 172.190.42.55 port 37436
Dec 08 19:47:57 compute-0 sshd-session[31436]: Received disconnect from 172.190.42.55 port 37436:11: Bye Bye [preauth]
Dec 08 19:47:57 compute-0 sshd-session[31436]: Disconnected from invalid user ubuntu 172.190.42.55 port 37436 [preauth]
Dec 08 19:48:00 compute-0 sudo[31410]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:00 compute-0 sshd-session[31081]: Connection closed by 192.168.122.30 port 41596
Dec 08 19:48:00 compute-0 sshd-session[31078]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:48:00 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Dec 08 19:48:00 compute-0 systemd[1]: session-7.scope: Consumed 7.862s CPU time.
Dec 08 19:48:00 compute-0 systemd-logind[793]: Session 7 logged out. Waiting for processes to exit.
Dec 08 19:48:00 compute-0 systemd-logind[793]: Removed session 7.
Dec 08 19:48:06 compute-0 sshd-session[31471]: Accepted publickey for zuul from 192.168.122.30 port 49526 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:48:06 compute-0 systemd-logind[793]: New session 8 of user zuul.
Dec 08 19:48:06 compute-0 systemd[1]: Started Session 8 of User zuul.
Dec 08 19:48:06 compute-0 sshd-session[31471]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:48:07 compute-0 python3.9[31624]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:48:08 compute-0 sshd-session[31474]: Connection closed by 192.168.122.30 port 49526
Dec 08 19:48:08 compute-0 sshd-session[31471]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:48:08 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Dec 08 19:48:08 compute-0 systemd-logind[793]: Session 8 logged out. Waiting for processes to exit.
Dec 08 19:48:08 compute-0 systemd-logind[793]: Removed session 8.
Dec 08 19:48:23 compute-0 sshd-session[31654]: Accepted publickey for zuul from 192.168.122.30 port 36564 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:48:23 compute-0 systemd-logind[793]: New session 9 of user zuul.
Dec 08 19:48:23 compute-0 systemd[1]: Started Session 9 of User zuul.
Dec 08 19:48:23 compute-0 sshd-session[31654]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:48:24 compute-0 python3.9[31809]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 08 19:48:24 compute-0 sshd-session[31759]: Invalid user maarch from 159.223.8.81 port 38348
Dec 08 19:48:24 compute-0 sshd-session[31759]: Received disconnect from 159.223.8.81 port 38348:11: Bye Bye [preauth]
Dec 08 19:48:24 compute-0 sshd-session[31759]: Disconnected from invalid user maarch 159.223.8.81 port 38348 [preauth]
Dec 08 19:48:25 compute-0 python3.9[31983]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:48:25 compute-0 sshd-session[31652]: Received disconnect from 45.78.228.32 port 58968:11: Bye Bye [preauth]
Dec 08 19:48:25 compute-0 sshd-session[31652]: Disconnected from authenticating user root 45.78.228.32 port 58968 [preauth]
Dec 08 19:48:26 compute-0 sudo[32133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqkdrcwdgrbiymlduknzroxkkheuwral ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223306.122468-45-269556318448964/AnsiballZ_command.py'
Dec 08 19:48:26 compute-0 sudo[32133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:26 compute-0 python3.9[32135]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:48:26 compute-0 sudo[32133]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:27 compute-0 sudo[32286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzjhgkmelnkebcfszfjkqglemniymkaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223306.9723816-57-117515840632471/AnsiballZ_stat.py'
Dec 08 19:48:27 compute-0 sudo[32286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:27 compute-0 python3.9[32288]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:48:27 compute-0 sudo[32286]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:28 compute-0 sudo[32438]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmqgzfncyeaukeminmpqnyavpjfjstmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223307.7413056-65-54712210227075/AnsiballZ_file.py'
Dec 08 19:48:28 compute-0 sudo[32438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:28 compute-0 python3.9[32440]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:48:28 compute-0 sudo[32438]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:28 compute-0 sudo[32590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdyqoxbeedrthuxiqkghbollylbtnunl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223308.507747-73-112894462793487/AnsiballZ_stat.py'
Dec 08 19:48:28 compute-0 sudo[32590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:28 compute-0 python3.9[32592]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:48:28 compute-0 sudo[32590]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:29 compute-0 sudo[32713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbbvjkpgeatauqibtbvyzaqsnvxbuilt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223308.507747-73-112894462793487/AnsiballZ_copy.py'
Dec 08 19:48:29 compute-0 sudo[32713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:29 compute-0 python3.9[32715]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223308.507747-73-112894462793487/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:48:29 compute-0 sudo[32713]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:30 compute-0 sudo[32865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibfecdwzfbpmguazbridphljegxmynox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223309.888183-88-270946122459604/AnsiballZ_setup.py'
Dec 08 19:48:30 compute-0 sudo[32865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:30 compute-0 python3.9[32867]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:48:30 compute-0 sudo[32865]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:31 compute-0 sudo[33021]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyyrxpkfrjocvxwpzxgrbxvdkinmdmnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223310.8657846-96-74489595763179/AnsiballZ_file.py'
Dec 08 19:48:31 compute-0 sudo[33021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:32 compute-0 python3.9[33023]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:48:32 compute-0 sudo[33021]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:32 compute-0 sudo[33173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jimkvyhyxxekxcchfxhgnahcbdxgfdcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223312.290195-105-122475405466713/AnsiballZ_file.py'
Dec 08 19:48:32 compute-0 sudo[33173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:32 compute-0 python3.9[33175]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:48:32 compute-0 sudo[33173]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:33 compute-0 python3.9[33325]: ansible-ansible.builtin.service_facts Invoked
Dec 08 19:48:37 compute-0 python3.9[33578]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:48:38 compute-0 python3.9[33728]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:48:39 compute-0 python3.9[33882]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:48:40 compute-0 sudo[34038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtvgfdhqklpnmzbxjjcumwbtetnodmar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223319.7647676-153-32435298584574/AnsiballZ_setup.py'
Dec 08 19:48:40 compute-0 sudo[34038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:40 compute-0 python3.9[34040]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:48:40 compute-0 sudo[34038]: pam_unix(sudo:session): session closed for user root
Dec 08 19:48:41 compute-0 sudo[34122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssjeijnmfvyzyqhwknnahjfzhucfnyrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223319.7647676-153-32435298584574/AnsiballZ_dnf.py'
Dec 08 19:48:41 compute-0 sudo[34122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:48:41 compute-0 python3.9[34124]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:48:59 compute-0 sshd-session[34215]: Received disconnect from 101.47.160.247 port 41880:11: Bye Bye [preauth]
Dec 08 19:48:59 compute-0 sshd-session[34215]: Disconnected from authenticating user root 101.47.160.247 port 41880 [preauth]
Dec 08 19:49:11 compute-0 sshd-session[34295]: Invalid user adi from 172.190.42.55 port 54024
Dec 08 19:49:11 compute-0 sshd-session[34295]: Received disconnect from 172.190.42.55 port 54024:11: Bye Bye [preauth]
Dec 08 19:49:11 compute-0 sshd-session[34295]: Disconnected from invalid user adi 172.190.42.55 port 54024 [preauth]
Dec 08 19:49:22 compute-0 sshd-session[34318]: Invalid user ubuntu from 159.223.8.81 port 40222
Dec 08 19:49:23 compute-0 sshd-session[34318]: Received disconnect from 159.223.8.81 port 40222:11: Bye Bye [preauth]
Dec 08 19:49:23 compute-0 sshd-session[34318]: Disconnected from invalid user ubuntu 159.223.8.81 port 40222 [preauth]
Dec 08 19:49:34 compute-0 systemd[1]: Reloading.
Dec 08 19:49:34 compute-0 systemd-rc-local-generator[34376]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:49:34 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 08 19:49:35 compute-0 systemd[1]: Reloading.
Dec 08 19:49:35 compute-0 systemd-rc-local-generator[34420]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:49:35 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 08 19:49:35 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 08 19:49:35 compute-0 systemd[1]: Reloading.
Dec 08 19:49:35 compute-0 systemd-rc-local-generator[34457]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:49:35 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 08 19:49:36 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 19:49:36 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 19:49:36 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 19:49:55 compute-0 sshd-session[34518]: Invalid user httpd from 222.172.32.246 port 2172
Dec 08 19:49:55 compute-0 sshd-session[34518]: Received disconnect from 222.172.32.246 port 2172:11: Bye Bye [preauth]
Dec 08 19:49:55 compute-0 sshd-session[34518]: Disconnected from invalid user httpd 222.172.32.246 port 2172 [preauth]
Dec 08 19:50:19 compute-0 sshd-session[34623]: Invalid user httpd from 159.223.8.81 port 60882
Dec 08 19:50:19 compute-0 sshd-session[34623]: Received disconnect from 159.223.8.81 port 60882:11: Bye Bye [preauth]
Dec 08 19:50:19 compute-0 sshd-session[34623]: Disconnected from invalid user httpd 159.223.8.81 port 60882 [preauth]
Dec 08 19:50:23 compute-0 sshd-session[34655]: Received disconnect from 172.190.42.55 port 41066:11: Bye Bye [preauth]
Dec 08 19:50:23 compute-0 sshd-session[34655]: Disconnected from authenticating user root 172.190.42.55 port 41066 [preauth]
Dec 08 19:50:45 compute-0 sshd-session[34702]: Received disconnect from 45.78.228.32 port 48130:11: Bye Bye [preauth]
Dec 08 19:50:45 compute-0 sshd-session[34702]: Disconnected from authenticating user root 45.78.228.32 port 48130 [preauth]
Dec 08 19:50:50 compute-0 kernel: SELinux:  Converting 2717 SID table entries...
Dec 08 19:50:50 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 19:50:50 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 08 19:50:50 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 19:50:50 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 08 19:50:50 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 19:50:50 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 19:50:50 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 19:50:51 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 08 19:50:51 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 08 19:50:51 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 08 19:50:51 compute-0 systemd[1]: Reloading.
Dec 08 19:50:51 compute-0 systemd-rc-local-generator[34818]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:50:51 compute-0 systemd[1]: Starting dnf makecache...
Dec 08 19:50:51 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 08 19:50:51 compute-0 dnf[34853]: Failed determining last makecache time.
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-openstack-barbican-42b4c41831408a8e323 133 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 189 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-openstack-cinder-1c00d6490d88e436f26ef 188 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-python-stevedore-c4acc5639fd2329372142 199 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-python-cloudkitty-tests-tempest-2c80f8 185 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-os-refresh-config-9bfc52b5049be2d8de61 149 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 203 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-python-designate-tests-tempest-347fdbc 205 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-openstack-glance-1fd12c29b339f30fe823e 207 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 204 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-openstack-manila-3c01b7181572c95dac462 199 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-python-whitebox-neutron-tests-tempest- 189 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-openstack-octavia-ba397f07a7331190208c 195 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-openstack-watcher-c014f81a8647287f6dcc 200 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-ansible-config_template-5ccaa22121a7ff 192 kB/s | 3.0 kB     00:00
Dec 08 19:50:51 compute-0 dnf[34853]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 182 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: delorean-openstack-swift-dc98a8463506ac520c469a 187 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: delorean-python-tempestconf-8515371b7cceebd4282 188 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: delorean-openstack-heat-ui-013accbfd179753bc3f0 171 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: CentOS Stream 9 - BaseOS                         59 kB/s | 7.3 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: CentOS Stream 9 - AppStream                      75 kB/s | 7.4 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: CentOS Stream 9 - CRB                            71 kB/s | 7.2 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: CentOS Stream 9 - Extras packages                71 kB/s | 8.3 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: dlrn-antelope-testing                           148 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: dlrn-antelope-build-deps                        182 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: centos9-rabbitmq                                 86 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 08 19:50:52 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 08 19:50:52 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.128s CPU time.
Dec 08 19:50:52 compute-0 systemd[1]: run-rce083691a075479a95ae460f5268b2ef.service: Deactivated successfully.
Dec 08 19:50:52 compute-0 dnf[34853]: centos9-storage                                  38 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: centos9-opstools                                115 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 sudo[34122]: pam_unix(sudo:session): session closed for user root
Dec 08 19:50:52 compute-0 dnf[34853]: NFV SIG OpenvSwitch                              49 kB/s | 3.0 kB     00:00
Dec 08 19:50:52 compute-0 dnf[34853]: repo-setup-centos-appstream                     147 kB/s | 4.4 kB     00:00
Dec 08 19:50:53 compute-0 dnf[34853]: repo-setup-centos-baseos                        172 kB/s | 3.9 kB     00:00
Dec 08 19:50:53 compute-0 dnf[34853]: repo-setup-centos-highavailability              151 kB/s | 3.9 kB     00:00
Dec 08 19:50:53 compute-0 dnf[34853]: repo-setup-centos-powertools                    150 kB/s | 4.3 kB     00:00
Dec 08 19:50:53 compute-0 dnf[34853]: Extra Packages for Enterprise Linux 9 - x86_64  283 kB/s |  35 kB     00:00
Dec 08 19:50:53 compute-0 sudo[35777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wihimryjnvxzdntcbdkukrusqdlstial ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223453.137603-165-270051173166749/AnsiballZ_command.py'
Dec 08 19:50:53 compute-0 sudo[35777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:50:53 compute-0 python3.9[35779]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:50:53 compute-0 dnf[34853]: Metadata cache created.
Dec 08 19:50:53 compute-0 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 08 19:50:53 compute-0 systemd[1]: Finished dnf makecache.
Dec 08 19:50:53 compute-0 systemd[1]: dnf-makecache.service: Consumed 1.659s CPU time.
Dec 08 19:50:54 compute-0 sudo[35777]: pam_unix(sudo:session): session closed for user root
Dec 08 19:50:55 compute-0 sudo[36058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-foapmbxfxcbagwcticbxlgajebovuvfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223454.6384475-173-279446896309767/AnsiballZ_selinux.py'
Dec 08 19:50:55 compute-0 sudo[36058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:50:55 compute-0 python3.9[36060]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 08 19:50:55 compute-0 sudo[36058]: pam_unix(sudo:session): session closed for user root
Dec 08 19:50:56 compute-0 sudo[36210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jccbglkzlpxfvhqvvlxgjadkcqwetixt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223455.927516-184-83535288209146/AnsiballZ_command.py'
Dec 08 19:50:56 compute-0 sudo[36210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:50:56 compute-0 python3.9[36212]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 08 19:50:57 compute-0 sudo[36210]: pam_unix(sudo:session): session closed for user root
Dec 08 19:50:57 compute-0 sudo[36363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhfxulyjcpqypnrjurtsyswpgtctgehf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223457.5769117-192-160371175063134/AnsiballZ_file.py'
Dec 08 19:50:57 compute-0 sudo[36363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:50:58 compute-0 python3.9[36365]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:50:58 compute-0 sudo[36363]: pam_unix(sudo:session): session closed for user root
Dec 08 19:50:59 compute-0 sudo[36515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vefwrbsanakyseehlnuupzxtzqoxzjcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223458.8531246-200-263701728368870/AnsiballZ_mount.py'
Dec 08 19:50:59 compute-0 sudo[36515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:50:59 compute-0 python3.9[36517]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 08 19:50:59 compute-0 sudo[36515]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:00 compute-0 sudo[36667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bekfaukggenpcwhqubwrashztvdwassa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223460.691792-228-8906199528457/AnsiballZ_file.py'
Dec 08 19:51:00 compute-0 sudo[36667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:01 compute-0 python3.9[36669]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:51:01 compute-0 sudo[36667]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:01 compute-0 sudo[36819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flunbjzxhakntxhqkjzojbldprfkzsxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223461.347168-236-165835679772824/AnsiballZ_stat.py'
Dec 08 19:51:01 compute-0 sudo[36819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:01 compute-0 python3.9[36821]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:51:01 compute-0 sudo[36819]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:02 compute-0 sudo[36942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cocudceysmonopskibsmmoguxhpoliqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223461.347168-236-165835679772824/AnsiballZ_copy.py'
Dec 08 19:51:02 compute-0 sudo[36942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:02 compute-0 python3.9[36944]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223461.347168-236-165835679772824/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e0eb11d12c0f18deeac27fe895302cd1709bd197 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:51:02 compute-0 sudo[36942]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:03 compute-0 sudo[37094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkssqsnflbuumnntxjplmjrxysipywlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223462.897516-260-126899940086421/AnsiballZ_stat.py'
Dec 08 19:51:03 compute-0 sudo[37094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:05 compute-0 python3.9[37096]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:51:05 compute-0 sudo[37094]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:06 compute-0 sudo[37246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnndnwpowrgessrrateyyscngevenxjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223465.7821965-268-120304159768807/AnsiballZ_command.py'
Dec 08 19:51:06 compute-0 sudo[37246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:10 compute-0 python3.9[37248]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:51:10 compute-0 sudo[37246]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:10 compute-0 sudo[37399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yifmiolvsblskcpyudwmfutcwrwvcodz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223470.3038673-276-195350883038535/AnsiballZ_file.py'
Dec 08 19:51:10 compute-0 sudo[37399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:10 compute-0 python3.9[37401]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:51:10 compute-0 sudo[37399]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:11 compute-0 sudo[37551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozrgveeumhrzaxigioiijraifjrfszed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223471.2890577-287-232158930109713/AnsiballZ_getent.py'
Dec 08 19:51:11 compute-0 sudo[37551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:11 compute-0 python3.9[37553]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 08 19:51:11 compute-0 sudo[37551]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:11 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 08 19:51:12 compute-0 sudo[37705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zajcbkzijztknfljuxvdobbsvpibekrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223472.112903-295-232385388656852/AnsiballZ_group.py'
Dec 08 19:51:12 compute-0 sudo[37705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:12 compute-0 python3.9[37707]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 08 19:51:13 compute-0 groupadd[37708]: group added to /etc/group: name=qemu, GID=107
Dec 08 19:51:13 compute-0 groupadd[37708]: group added to /etc/gshadow: name=qemu
Dec 08 19:51:13 compute-0 groupadd[37708]: new group: name=qemu, GID=107
Dec 08 19:51:13 compute-0 sudo[37705]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:13 compute-0 sudo[37863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnhyjwjepbmslssjvutelurncialnany ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223473.4488137-303-58397520513628/AnsiballZ_user.py'
Dec 08 19:51:13 compute-0 sudo[37863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:14 compute-0 python3.9[37865]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 08 19:51:14 compute-0 useradd[37867]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Dec 08 19:51:14 compute-0 sudo[37863]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:14 compute-0 sshd-session[37874]: Invalid user adi from 159.223.8.81 port 37694
Dec 08 19:51:14 compute-0 sshd-session[37874]: Received disconnect from 159.223.8.81 port 37694:11: Bye Bye [preauth]
Dec 08 19:51:14 compute-0 sshd-session[37874]: Disconnected from invalid user adi 159.223.8.81 port 37694 [preauth]
Dec 08 19:51:14 compute-0 sudo[38025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjegchbcqtmcjtsrbsafycbpfkgfvpci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223474.7340841-311-263244944171777/AnsiballZ_getent.py'
Dec 08 19:51:14 compute-0 sudo[38025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:15 compute-0 python3.9[38027]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 08 19:51:15 compute-0 sudo[38025]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:15 compute-0 sudo[38180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjeetrgiextfapvntdxipmjnqykedrck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223475.383592-319-265977318377716/AnsiballZ_group.py'
Dec 08 19:51:15 compute-0 sudo[38180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:16 compute-0 python3.9[38182]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 08 19:51:16 compute-0 groupadd[38183]: group added to /etc/group: name=hugetlbfs, GID=42477
Dec 08 19:51:16 compute-0 groupadd[38183]: group added to /etc/gshadow: name=hugetlbfs
Dec 08 19:51:16 compute-0 groupadd[38183]: new group: name=hugetlbfs, GID=42477
Dec 08 19:51:16 compute-0 sudo[38180]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:16 compute-0 sudo[38338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhyvvygqrndaqnudnyrphhqtlrbvjdii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223476.4417844-328-238912691858015/AnsiballZ_file.py'
Dec 08 19:51:16 compute-0 sudo[38338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:16 compute-0 python3.9[38340]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 08 19:51:16 compute-0 sudo[38338]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:17 compute-0 sudo[38490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwhcllegabnatjbwgcuobmhnhaknvhlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223477.2432084-339-233321311384833/AnsiballZ_dnf.py'
Dec 08 19:51:17 compute-0 sudo[38490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:17 compute-0 python3.9[38492]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:51:19 compute-0 sudo[38490]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:20 compute-0 sudo[38643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkdgucjgranggmfjzeidocpikkcwlyis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223480.0209386-347-206623833425024/AnsiballZ_file.py'
Dec 08 19:51:20 compute-0 sudo[38643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:20 compute-0 python3.9[38645]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:51:20 compute-0 sudo[38643]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:21 compute-0 sudo[38795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjcrbgryiugdkelxlqyyvtcjmecdeghd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223480.8729165-355-225629581062983/AnsiballZ_stat.py'
Dec 08 19:51:21 compute-0 sudo[38795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:21 compute-0 python3.9[38797]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:51:21 compute-0 sudo[38795]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:21 compute-0 sudo[38918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llmszzunwvrxkwaahndyucvydgeddhpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223480.8729165-355-225629581062983/AnsiballZ_copy.py'
Dec 08 19:51:21 compute-0 sudo[38918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:21 compute-0 python3.9[38920]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765223480.8729165-355-225629581062983/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:51:21 compute-0 sudo[38918]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:22 compute-0 sudo[39070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxwjeqhpughpjytldsjjzkzaexzmevwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223482.0852559-370-178083205284569/AnsiballZ_systemd.py'
Dec 08 19:51:22 compute-0 sudo[39070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:23 compute-0 python3.9[39072]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 19:51:23 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 08 19:51:23 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 08 19:51:23 compute-0 kernel: Bridge firewalling registered
Dec 08 19:51:23 compute-0 systemd-modules-load[39076]: Inserted module 'br_netfilter'
Dec 08 19:51:23 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 08 19:51:23 compute-0 sudo[39070]: pam_unix(sudo:session): session closed for user root
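The tasks above write /etc/modules-load.d/99-edpm.conf and restart systemd-modules-load.service; the kernel messages confirm br_netfilter was inserted. A minimal by-hand sketch, assuming the file lists only br_netfilter (its real contents are not logged):

    echo br_netfilter | sudo tee /etc/modules-load.d/99-edpm.conf
    sudo systemctl restart systemd-modules-load.service
    lsmod | grep br_netfilter    # confirm the module is loaded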
Dec 08 19:51:23 compute-0 sudo[39229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwqpulqtvbccecodifbpimnvidraoasp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223483.3425484-378-52120194298068/AnsiballZ_stat.py'
Dec 08 19:51:23 compute-0 sudo[39229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:23 compute-0 python3.9[39231]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:51:23 compute-0 sudo[39229]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:24 compute-0 sudo[39352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syubmayvsittttgynilnrlnnkcmngnwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223483.3425484-378-52120194298068/AnsiballZ_copy.py'
Dec 08 19:51:24 compute-0 sudo[39352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:24 compute-0 python3.9[39354]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765223483.3425484-378-52120194298068/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:51:24 compute-0 sudo[39352]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:25 compute-0 sudo[39504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pexpfviaiwdeaxvslfglaujvectsisjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223484.9355168-396-186974971518195/AnsiballZ_dnf.py'
Dec 08 19:51:25 compute-0 sudo[39504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:25 compute-0 python3.9[39506]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:51:33 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 19:51:33 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 19:51:33 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 08 19:51:33 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 08 19:51:33 compute-0 systemd[1]: Reloading.
Dec 08 19:51:33 compute-0 systemd-rc-local-generator[39579]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:51:33 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 08 19:51:34 compute-0 sudo[39504]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:35 compute-0 python3.9[41077]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:51:36 compute-0 python3.9[42036]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 08 19:51:36 compute-0 python3.9[42858]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:51:37 compute-0 sudo[43705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfajuhumgntvjbajlkpwxzyurelabuew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223497.06205-435-78422761556798/AnsiballZ_command.py'
Dec 08 19:51:37 compute-0 sudo[43705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:37 compute-0 python3.9[43707]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:51:37 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 08 19:51:37 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 08 19:51:37 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 08 19:51:37 compute-0 systemd[1]: man-db-cache-update.service: Consumed 4.634s CPU time.
Dec 08 19:51:37 compute-0 systemd[1]: run-r0ca087d59c364122bafa3fd6c00f29e9.service: Deactivated successfully.
Dec 08 19:51:38 compute-0 systemd[1]: Starting Authorization Manager...
Dec 08 19:51:38 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 08 19:51:38 compute-0 polkitd[43925]: Started polkitd version 0.117
Dec 08 19:51:38 compute-0 polkitd[43925]: Loading rules from directory /etc/polkit-1/rules.d
Dec 08 19:51:38 compute-0 polkitd[43925]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 08 19:51:38 compute-0 polkitd[43925]: Finished loading, compiling and executing 2 rules
Dec 08 19:51:38 compute-0 polkitd[43925]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Dec 08 19:51:38 compute-0 systemd[1]: Started Authorization Manager.
Dec 08 19:51:38 compute-0 sudo[43705]: pam_unix(sudo:session): session closed for user root
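tuned-adm switches the active tuned profile, which is why the Dynamic System Tuning Daemon is started around the command. A quick sketch of applying and checking the same profile:

    sudo tuned-adm profile throughput-performance
    tuned-adm active                     # expected: Current active profile: throughput-performance
    cat /etc/tuned/active_profile        # the file read by the earlier stat/slurp tasks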
Dec 08 19:51:38 compute-0 sudo[44094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jaqnkckxylehttozvdccmikhmxfjfwsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223498.4523067-444-250613890154860/AnsiballZ_systemd.py'
Dec 08 19:51:38 compute-0 sudo[44094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:39 compute-0 python3.9[44096]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:51:39 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 08 19:51:39 compute-0 sshd-session[44097]: Invalid user cheeki from 172.190.42.55 port 59380
Dec 08 19:51:39 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Dec 08 19:51:39 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 08 19:51:39 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 08 19:51:39 compute-0 sshd-session[44097]: Received disconnect from 172.190.42.55 port 59380:11: Bye Bye [preauth]
Dec 08 19:51:39 compute-0 sshd-session[44097]: Disconnected from invalid user cheeki 172.190.42.55 port 59380 [preauth]
Dec 08 19:51:39 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 08 19:51:39 compute-0 sudo[44094]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:39 compute-0 python3.9[44259]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 08 19:51:42 compute-0 sudo[44409]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahcwvfknzlrcuixbigksdfmwmufzugej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223501.744167-501-229365438674070/AnsiballZ_systemd.py'
Dec 08 19:51:42 compute-0 sudo[44409]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:42 compute-0 python3.9[44411]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:51:42 compute-0 systemd[1]: Reloading.
Dec 08 19:51:42 compute-0 systemd-rc-local-generator[44441]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:51:42 compute-0 sudo[44409]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:43 compute-0 sudo[44598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyzyjofpielgdoovyfelfdvykufjqdws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223502.7558124-501-173635481753319/AnsiballZ_systemd.py'
Dec 08 19:51:43 compute-0 sudo[44598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:43 compute-0 python3.9[44600]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:51:43 compute-0 systemd[1]: Reloading.
Dec 08 19:51:43 compute-0 systemd-rc-local-generator[44626]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:51:43 compute-0 sudo[44598]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:44 compute-0 sudo[44787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pydnyuvhlnqzegdevpqkmdylpvpmfnkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223503.839623-517-170455011437034/AnsiballZ_command.py'
Dec 08 19:51:44 compute-0 sudo[44787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:44 compute-0 python3.9[44789]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:51:44 compute-0 sudo[44787]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:44 compute-0 sudo[44940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kmzvbgjkszxtniaxsbfzbmvmonjsvled ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223504.6250515-525-186535477018582/AnsiballZ_command.py'
Dec 08 19:51:44 compute-0 sudo[44940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:45 compute-0 python3.9[44942]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:51:45 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 08 19:51:45 compute-0 sudo[44940]: pam_unix(sudo:session): session closed for user root
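mkswap formats /swap and swapon enables it; the kernel line shows roughly 1 GiB (1048572 KiB) of swap coming online. The log does not show how /swap was allocated or whether it is persisted in /etc/fstab, so those parts of the sketch below are assumptions:

    # Assumption: the swap file was pre-allocated before this play; one common way:
    sudo dd if=/dev/zero of=/swap bs=1M count=1024
    sudo chmod 600 /swap
    sudo mkswap /swap
    sudo swapon /swap
    swapon --show                        # verify the new swap area
    # Making it persistent (not shown in the log) needs an fstab entry such as:
    # /swap none swap defaults 0 0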
Dec 08 19:51:45 compute-0 sudo[45093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjlesybldswfevsqancugilmlhvflyiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223505.2616851-533-125272339469297/AnsiballZ_command.py'
Dec 08 19:51:45 compute-0 sudo[45093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:45 compute-0 python3.9[45095]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:51:47 compute-0 sudo[45093]: pam_unix(sudo:session): session closed for user root
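update-ca-trust rebuilds the consolidated trust bundles from /etc/pki/ca-trust/source. A minimal sketch of adding a CA and refreshing the store; the certificate file name is hypothetical and the log does not show which anchors, if any, were added:

    sudo cp my-internal-ca.pem /etc/pki/ca-trust/source/anchors/   # hypothetical anchor file
    sudo update-ca-trust
    trust list | grep -i my-internal-ca                            # confirm the CA is now trusted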
Dec 08 19:51:47 compute-0 sudo[45255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukzohelnehugmwxslisztiurqhxxoebf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223507.4047477-541-143396217744474/AnsiballZ_command.py'
Dec 08 19:51:47 compute-0 sudo[45255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:47 compute-0 python3.9[45257]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:51:47 compute-0 sudo[45255]: pam_unix(sudo:session): session closed for user root
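Per the kernel's KSM interface, writing 2 to /sys/kernel/mm/ksm/run stops ksmd and unmerges all currently merged pages, which complements disabling ksm.service and ksmtuned.service above. The same sequence by hand:

    sudo systemctl disable --now ksm.service ksmtuned.service
    echo 2 | sudo tee /sys/kernel/mm/ksm/run
    cat /sys/kernel/mm/ksm/pages_shared      # expected to drop to 0 once pages are unmerged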
Dec 08 19:51:48 compute-0 sudo[45408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktpzzdjwtablwavjytcsbzeqnudzuajr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223508.109922-549-229378942659622/AnsiballZ_systemd.py'
Dec 08 19:51:48 compute-0 sudo[45408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:48 compute-0 python3.9[45410]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 19:51:48 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 08 19:51:48 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Dec 08 19:51:48 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Dec 08 19:51:48 compute-0 systemd[1]: Starting Apply Kernel Variables...
Dec 08 19:51:48 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 08 19:51:48 compute-0 systemd[1]: Finished Apply Kernel Variables.
Dec 08 19:51:48 compute-0 sudo[45408]: pam_unix(sudo:session): session closed for user root
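Restarting systemd-sysctl.service re-applies every drop-in under /etc/sysctl.d, including the 99-edpm.conf written earlier (its keys are not logged). The direct equivalent:

    sudo sysctl --system                             # re-read /etc/sysctl.d/*.conf and friends
    sysctl net.bridge.bridge-nf-call-iptables        # example query only; the actual EDPM keys are not in the log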
Dec 08 19:51:49 compute-0 sshd-session[31657]: Connection closed by 192.168.122.30 port 36564
Dec 08 19:51:49 compute-0 sshd-session[31654]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:51:49 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Dec 08 19:51:49 compute-0 systemd[1]: session-9.scope: Consumed 2min 17.067s CPU time.
Dec 08 19:51:49 compute-0 systemd-logind[793]: Session 9 logged out. Waiting for processes to exit.
Dec 08 19:51:49 compute-0 systemd-logind[793]: Removed session 9.
Dec 08 19:51:54 compute-0 sshd-session[45441]: Accepted publickey for zuul from 192.168.122.30 port 56966 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:51:54 compute-0 systemd-logind[793]: New session 10 of user zuul.
Dec 08 19:51:54 compute-0 systemd[1]: Started Session 10 of User zuul.
Dec 08 19:51:54 compute-0 sshd-session[45441]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:51:55 compute-0 python3.9[45594]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:51:56 compute-0 python3.9[45748]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:51:57 compute-0 sudo[45902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shpowwadwdyuavsaabxpjdllhpedpebd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223517.1100392-50-154907742533452/AnsiballZ_command.py'
Dec 08 19:51:57 compute-0 sudo[45902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:57 compute-0 python3.9[45904]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:51:57 compute-0 sudo[45902]: pam_unix(sudo:session): session closed for user root
Dec 08 19:51:58 compute-0 python3.9[46055]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:51:59 compute-0 sudo[46209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpzglehxcvaudqthwmeigxofpjglnhkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223518.9891615-70-219719305185639/AnsiballZ_setup.py'
Dec 08 19:51:59 compute-0 sudo[46209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:51:59 compute-0 python3.9[46211]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:51:59 compute-0 sudo[46209]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:00 compute-0 sudo[46293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrlsutfzghcwnuwcqjicdigftyyraxzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223518.9891615-70-219719305185639/AnsiballZ_dnf.py'
Dec 08 19:52:00 compute-0 sudo[46293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:00 compute-0 python3.9[46295]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:52:01 compute-0 sudo[46293]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:02 compute-0 sudo[46446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlhfbglaitoiozvduzvsorpyedtqrzcy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223522.1044888-82-169094894370603/AnsiballZ_setup.py'
Dec 08 19:52:02 compute-0 sudo[46446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:02 compute-0 python3.9[46448]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:52:02 compute-0 sudo[46446]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:03 compute-0 sudo[46617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxqkkawkcibsvlhgmujicpasvvibplyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223523.0594714-93-57670065812867/AnsiballZ_file.py'
Dec 08 19:52:03 compute-0 sudo[46617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:03 compute-0 python3.9[46619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:52:03 compute-0 sudo[46617]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:04 compute-0 sudo[46769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmksahzvtmxmianscbpbyaisgcyknzzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223523.8307145-101-150036927681402/AnsiballZ_command.py'
Dec 08 19:52:04 compute-0 sudo[46769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:04 compute-0 python3.9[46771]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:52:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat938610445-merged.mount: Deactivated successfully.
Dec 08 19:52:04 compute-0 podman[46772]: 2025-12-08 19:52:04.348934006 +0000 UTC m=+0.067586727 system refresh
Dec 08 19:52:04 compute-0 sudo[46769]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:05 compute-0 sudo[46932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdreixouwpspmazexsgvalpayrrpqred ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223524.5657403-109-40417243581256/AnsiballZ_stat.py'
Dec 08 19:52:05 compute-0 sudo[46932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:05 compute-0 python3.9[46934]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:52:05 compute-0 sudo[46932]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:05 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:52:05 compute-0 sudo[47055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhuqwewaedwihiwshpcsizxapwwbfriw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223524.5657403-109-40417243581256/AnsiballZ_copy.py'
Dec 08 19:52:05 compute-0 sudo[47055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:05 compute-0 python3.9[47057]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223524.5657403-109-40417243581256/.source.json follow=False _original_basename=podman_network_config.j2 checksum=fc079764b710abbcfa8cc14592db6b7d653e1963 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:52:05 compute-0 sudo[47055]: pam_unix(sudo:session): session closed for user root
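The copy above pre-seeds the default 'podman' network definition as /etc/containers/networks/podman.json (the netavark backend is selected a few tasks later). Its rendered content is redacted, but it can be inspected the same way the play did:

    sudo podman network inspect podman
    sudo cat /etc/containers/networks/podman.json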
Dec 08 19:52:06 compute-0 sudo[47207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivfyifovwhtcfjaxudtagjqhwypsqouy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223526.050251-124-24042697651545/AnsiballZ_stat.py'
Dec 08 19:52:06 compute-0 sudo[47207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:06 compute-0 python3.9[47209]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:52:06 compute-0 sudo[47207]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:06 compute-0 sudo[47330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxdqqsnylbjgbdmrezrqwxpdipycsang ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223526.050251-124-24042697651545/AnsiballZ_copy.py'
Dec 08 19:52:06 compute-0 sudo[47330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:07 compute-0 python3.9[47332]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765223526.050251-124-24042697651545/.source.conf follow=False _original_basename=registries.conf.j2 checksum=a92d4bce7d9cad3a31d9a297b9e21f629ee446cd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:52:07 compute-0 sudo[47330]: pam_unix(sudo:session): session closed for user root
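The drop-in under /etc/containers/registries.conf.d follows the containers-registries.conf(5) format; its content is redacted here (NOT_LOGGING_PARAMETER). The effective registry configuration can be checked after the fact:

    sudo cat /etc/containers/registries.conf.d/20-edpm-podman-registries.conf
    sudo podman info --format json | jq '.registries'    # assumes jq is installed; shows the merged registry view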
Dec 08 19:52:07 compute-0 sudo[47482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wchrafzqtqxqqcieapuvdnvveivsqxiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223527.3507879-140-248774797861390/AnsiballZ_ini_file.py'
Dec 08 19:52:07 compute-0 sudo[47482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:07 compute-0 python3.9[47484]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:52:08 compute-0 sudo[47482]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:08 compute-0 sudo[47634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhpgnlkfvfkimkreqcgtroaazwvdbdvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223528.1983352-140-61339428175148/AnsiballZ_ini_file.py'
Dec 08 19:52:08 compute-0 sudo[47634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:08 compute-0 python3.9[47636]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:52:08 compute-0 sudo[47634]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:09 compute-0 sudo[47786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrsqrzgaxinvtkwcymfyhbasqqyevmha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223528.8244972-140-169927030958247/AnsiballZ_ini_file.py'
Dec 08 19:52:09 compute-0 sudo[47786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:09 compute-0 python3.9[47788]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:52:09 compute-0 sudo[47786]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:09 compute-0 sudo[47938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vraimokxljueyasymdvcphlyibujorhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223529.4314697-140-281387620040832/AnsiballZ_ini_file.py'
Dec 08 19:52:09 compute-0 sudo[47938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:09 compute-0 python3.9[47940]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:52:09 compute-0 sudo[47938]: pam_unix(sudo:session): session closed for user root
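Taken together, the four ini_file tasks above should leave /etc/containers/containers.conf containing at least the following (ordering and surrounding comments may differ):

    [containers]
    pids_limit = 4096

    [engine]
    events_logger = "journald"
    runtime = "crun"

    [network]
    network_backend = "netavark"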
Dec 08 19:52:10 compute-0 python3.9[48090]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:52:11 compute-0 sudo[48242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfswmnrajmjyekpnoasiepnqvihwuogb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223530.9898577-180-183470630792367/AnsiballZ_dnf.py'
Dec 08 19:52:11 compute-0 sudo[48242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:11 compute-0 python3.9[48244]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:52:12 compute-0 sudo[48242]: pam_unix(sudo:session): session closed for user root
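This dnf task runs with download_only=True, so packages are only fetched into the local cache; installation happens in a later step. The command-line equivalent:

    sudo dnf install --downloadonly driverctl lvm2 crudini jq nftables NetworkManager \
        openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat \
        iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos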
Dec 08 19:52:12 compute-0 sshd-session[48246]: Invalid user socks from 159.223.8.81 port 43112
Dec 08 19:52:12 compute-0 sshd-session[48246]: Received disconnect from 159.223.8.81 port 43112:11: Bye Bye [preauth]
Dec 08 19:52:12 compute-0 sshd-session[48246]: Disconnected from invalid user socks 159.223.8.81 port 43112 [preauth]
Dec 08 19:52:13 compute-0 sudo[48397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saoyfwtwxrutfcztarweyantuqjnqwsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223532.974869-188-180516033600558/AnsiballZ_dnf.py'
Dec 08 19:52:13 compute-0 sudo[48397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:13 compute-0 python3.9[48399]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:52:15 compute-0 sudo[48397]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:15 compute-0 sudo[48557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlksvrgptrcfwsbsamngrbwpgqsnaezw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223535.4613822-198-249731425648906/AnsiballZ_dnf.py'
Dec 08 19:52:15 compute-0 sudo[48557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:15 compute-0 python3.9[48559]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:52:17 compute-0 sudo[48557]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:17 compute-0 sudo[48710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snpfmyesivzbpijqqknpepfuvttviebv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223537.4081469-207-61835806308762/AnsiballZ_dnf.py'
Dec 08 19:52:17 compute-0 sudo[48710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:17 compute-0 python3.9[48712]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:52:19 compute-0 sudo[48710]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:19 compute-0 sudo[48863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fadgysewdlmxoufszpmotjkmwdeokfmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223539.5799148-218-246499889843893/AnsiballZ_dnf.py'
Dec 08 19:52:19 compute-0 sudo[48863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:20 compute-0 python3.9[48865]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:52:24 compute-0 sudo[48863]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:25 compute-0 sudo[49037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixriuvehkjdaieimqkxvuqkbwttybgre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223544.87143-226-66997278113619/AnsiballZ_dnf.py'
Dec 08 19:52:25 compute-0 sudo[49037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:25 compute-0 python3.9[49039]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:52:28 compute-0 sudo[49037]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:28 compute-0 sudo[49212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgtaavjgucdoghwnnajpzffiksjzltuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223548.6390383-235-237199940159531/AnsiballZ_dnf.py'
Dec 08 19:52:28 compute-0 sudo[49212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:29 compute-0 python3.9[49214]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:52:30 compute-0 sudo[49212]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:31 compute-0 sudo[49365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmrlygungrbpdmnaiuahmioaxgtzihtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223550.731977-244-21063553320700/AnsiballZ_dnf.py'
Dec 08 19:52:31 compute-0 sudo[49365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:31 compute-0 python3.9[49367]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:52:54 compute-0 sshd-session[49511]: Received disconnect from 172.190.42.55 port 46568:11: Bye Bye [preauth]
Dec 08 19:52:54 compute-0 sshd-session[49511]: Disconnected from authenticating user root 172.190.42.55 port 46568 [preauth]
Dec 08 19:52:56 compute-0 sudo[49365]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:56 compute-0 sudo[49719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kaaooavrwfvhyrbbuzcmcycvjiukosqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223576.2855973-253-276030513888157/AnsiballZ_dnf.py'
Dec 08 19:52:56 compute-0 sudo[49719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:57 compute-0 python3.9[49721]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:52:58 compute-0 sudo[49719]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:59 compute-0 sudo[49875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yixcpvryldqrgjyuwuzetwqdxwpbatsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223579.0063796-264-179103303680727/AnsiballZ_file.py'
Dec 08 19:52:59 compute-0 sudo[49875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:52:59 compute-0 python3.9[49877]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:52:59 compute-0 sudo[49875]: pam_unix(sudo:session): session closed for user root
Dec 08 19:52:59 compute-0 sudo[50050]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhakjdhqsrpemkciarvxvpgbcrciqjyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223579.6439826-272-62231530132755/AnsiballZ_stat.py'
Dec 08 19:52:59 compute-0 sudo[50050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:53:00 compute-0 python3.9[50052]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:53:00 compute-0 sudo[50050]: pam_unix(sudo:session): session closed for user root
Dec 08 19:53:00 compute-0 sudo[50173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmewqjjmxypltbwqphmyyqtztlnxrgfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223579.6439826-272-62231530132755/AnsiballZ_copy.py'
Dec 08 19:53:00 compute-0 sudo[50173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:53:00 compute-0 python3.9[50175]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765223579.6439826-272-62231530132755/.source.json _original_basename=.c1okqdm2 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:53:00 compute-0 sudo[50173]: pam_unix(sudo:session): session closed for user root
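The copied auth.json reports checksum bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f, which appears to be the SHA-1 of a bare empty JSON object, i.e. no registry credentials were provided and the following pulls are anonymous. This can be checked locally:

    printf '{}' | sha1sum                           # expected: bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f
    sudo cat /root/.config/containers/auth.json     # expected to be {} if no credentials were configured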
Dec 08 19:53:01 compute-0 sudo[50325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmnsmfggnfowawgxunkahmqjklxovaad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223581.2023385-290-14944848357491/AnsiballZ_podman_image.py'
Dec 08 19:53:01 compute-0 sudo[50325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:53:02 compute-0 python3.9[50327]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 08 19:53:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:02 compute-0 sshd-session[49391]: Connection closed by 66.132.153.118 port 17790 [preauth]
Dec 08 19:53:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat3490616210-lower\x2dmapped.mount: Deactivated successfully.
Dec 08 19:53:07 compute-0 sshd-session[50388]: Invalid user wwwroot from 222.172.32.246 port 2173
Dec 08 19:53:07 compute-0 sshd-session[50388]: Received disconnect from 222.172.32.246 port 2173:11: Bye Bye [preauth]
Dec 08 19:53:07 compute-0 sshd-session[50388]: Disconnected from invalid user wwwroot 222.172.32.246 port 2173 [preauth]
Dec 08 19:53:09 compute-0 irqbalance[785]: Cannot change IRQ 26 affinity: Operation not permitted
Dec 08 19:53:09 compute-0 irqbalance[785]: IRQ 26 affinity is now unmanaged
Dec 08 19:53:11 compute-0 podman[50339]: 2025-12-08 19:53:11.088377852 +0000 UTC m=+8.960061944 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 08 19:53:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:11 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:11 compute-0 sudo[50325]: pam_unix(sudo:session): session closed for user root
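containers.podman.podman_image with pull=True behaves like a podman pull against the given auth file; each 'image pull' journal line records the resulting image ID. A by-hand equivalent for the first image:

    sudo podman pull --authfile /root/.config/containers/auth.json \
        quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
    sudo podman images | grep openstack-ovn-controller    # confirm the image is present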
Dec 08 19:53:11 compute-0 sudo[50633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvwcopombdrbbxnzztcksoywspnasevp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223591.596406-301-46448617291450/AnsiballZ_podman_image.py'
Dec 08 19:53:11 compute-0 sudo[50633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:53:12 compute-0 python3.9[50635]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 08 19:53:12 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:12 compute-0 sshd-session[50661]: Invalid user ftpuser from 159.223.8.81 port 35290
Dec 08 19:53:12 compute-0 sshd-session[50661]: Received disconnect from 159.223.8.81 port 35290:11: Bye Bye [preauth]
Dec 08 19:53:12 compute-0 sshd-session[50661]: Disconnected from invalid user ftpuser 159.223.8.81 port 35290 [preauth]
Dec 08 19:53:18 compute-0 sshd[1006]: Timeout before authentication for connection from 101.47.160.247 to 38.102.83.66, pid = 38152
Dec 08 19:53:27 compute-0 podman[50648]: 2025-12-08 19:53:27.249204456 +0000 UTC m=+15.116783742 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 19:53:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:27 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:27 compute-0 sudo[50633]: pam_unix(sudo:session): session closed for user root
Dec 08 19:53:28 compute-0 sudo[50942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpgwcnocfqokbaomfnzqmnugqrprfazm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223607.7309482-311-197864843652161/AnsiballZ_podman_image.py'
Dec 08 19:53:28 compute-0 sudo[50942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:53:28 compute-0 python3.9[50944]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 08 19:53:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:29 compute-0 podman[50956]: 2025-12-08 19:53:29.297454385 +0000 UTC m=+1.022438927 image pull bcd3898ac099c7fff3d2ff3fc32de931119ed36068f8a2617bd8fa95e51d1b81 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 08 19:53:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:29 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:30 compute-0 sudo[50942]: pam_unix(sudo:session): session closed for user root
Dec 08 19:53:30 compute-0 sudo[51187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dogwcnexztmpssjpqiemszbeipcetbeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223610.4031208-320-139085199578627/AnsiballZ_podman_image.py'
Dec 08 19:53:30 compute-0 sudo[51187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:53:30 compute-0 python3.9[51189]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 08 19:53:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:32 compute-0 sshd[1006]: drop connection #0 from [101.47.160.247]:50360 on [38.102.83.66]:22 penalty: exceeded LoginGraceTime
Dec 08 19:53:43 compute-0 podman[51201]: 2025-12-08 19:53:43.769199546 +0000 UTC m=+12.853391941 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 08 19:53:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:43 compute-0 sudo[51187]: pam_unix(sudo:session): session closed for user root
Dec 08 19:53:44 compute-0 sudo[51454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stclqbzlndgycgfgxavedjhfcdffuxqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223624.4180238-331-151431438674418/AnsiballZ_podman_image.py'
Dec 08 19:53:44 compute-0 sudo[51454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:53:44 compute-0 python3.9[51456]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 08 19:53:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:59 compute-0 podman[51468]: 2025-12-08 19:53:59.630151784 +0000 UTC m=+14.661576092 image pull b1b6d71b432c07886b3bae74df4dc9841d1f26407d5f96d6c1e400b0154d9a3d quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested
Dec 08 19:53:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:59 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:53:59 compute-0 sudo[51454]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:00 compute-0 sudo[51786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oboabtqptrmoynyvxrbmvmwajuckdmkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223640.0137365-331-53376684692343/AnsiballZ_podman_image.py'
Dec 08 19:54:00 compute-0 sudo[51786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:00 compute-0 python3.9[51788]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 08 19:54:01 compute-0 podman[51801]: 2025-12-08 19:54:01.726421178 +0000 UTC m=+1.170441553 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 08 19:54:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:54:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:54:01 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:54:01 compute-0 sudo[51786]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:02 compute-0 sshd-session[45444]: Connection closed by 192.168.122.30 port 56966
Dec 08 19:54:02 compute-0 sshd-session[45441]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:54:02 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Dec 08 19:54:02 compute-0 systemd[1]: session-10.scope: Consumed 2min 16.780s CPU time.
Dec 08 19:54:02 compute-0 systemd-logind[793]: Session 10 logged out. Waiting for processes to exit.
Dec 08 19:54:02 compute-0 systemd-logind[793]: Removed session 10.
Dec 08 19:54:03 compute-0 sshd-session[51950]: Received disconnect from 193.46.255.7 port 54316:11:  [preauth]
Dec 08 19:54:03 compute-0 sshd-session[51950]: Disconnected from authenticating user root 193.46.255.7 port 54316 [preauth]
Dec 08 19:54:08 compute-0 sshd-session[51952]: Accepted publickey for zuul from 192.168.122.30 port 39626 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:54:08 compute-0 systemd-logind[793]: New session 11 of user zuul.
Dec 08 19:54:08 compute-0 systemd[1]: Started Session 11 of User zuul.
Dec 08 19:54:08 compute-0 sshd-session[51952]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:54:09 compute-0 python3.9[52105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:54:10 compute-0 sudo[52259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwucjoosjvlcwpzzupzosvradfkudufo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223649.728523-36-197093643625404/AnsiballZ_getent.py'
Dec 08 19:54:10 compute-0 sudo[52259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:10 compute-0 python3.9[52261]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 08 19:54:10 compute-0 sudo[52259]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:10 compute-0 sudo[52412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahhgjtaiqodjcmupnfalayggluthhoiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223650.4960098-44-42843765366292/AnsiballZ_group.py'
Dec 08 19:54:10 compute-0 sudo[52412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:11 compute-0 python3.9[52414]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 08 19:54:11 compute-0 groupadd[52415]: group added to /etc/group: name=openvswitch, GID=42476
Dec 08 19:54:11 compute-0 groupadd[52415]: group added to /etc/gshadow: name=openvswitch
Dec 08 19:54:11 compute-0 groupadd[52415]: new group: name=openvswitch, GID=42476
Dec 08 19:54:11 compute-0 sudo[52412]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:11 compute-0 sudo[52570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zppwnzzbhlgnkrkmqekqmzrnxngmcpbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223651.3333983-52-1194983920247/AnsiballZ_user.py'
Dec 08 19:54:11 compute-0 sudo[52570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:12 compute-0 python3.9[52572]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 08 19:54:12 compute-0 useradd[52574]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Dec 08 19:54:12 compute-0 useradd[52574]: add 'openvswitch' to group 'hugetlbfs'
Dec 08 19:54:12 compute-0 useradd[52574]: add 'openvswitch' to shadow group 'hugetlbfs'
Dec 08 19:54:12 compute-0 sudo[52570]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:12 compute-0 sudo[52730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdgnidfeyeebtgquxpqqvsloyepfbnbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223652.3771348-62-30111977793484/AnsiballZ_setup.py'
Dec 08 19:54:12 compute-0 sudo[52730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:12 compute-0 python3.9[52732]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:54:13 compute-0 sudo[52730]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:13 compute-0 sudo[52814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eddiaomgrfonrcmygocrdpczythucjnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223652.3771348-62-30111977793484/AnsiballZ_dnf.py'
Dec 08 19:54:13 compute-0 sudo[52814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:13 compute-0 python3.9[52816]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:54:14 compute-0 sshd-session[52818]: Received disconnect from 172.190.42.55 port 33032:11: Bye Bye [preauth]
Dec 08 19:54:14 compute-0 sshd-session[52818]: Disconnected from authenticating user root 172.190.42.55 port 33032 [preauth]
Dec 08 19:54:15 compute-0 sudo[52814]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:15 compute-0 sshd-session[52820]: Invalid user wwwroot from 159.223.8.81 port 36398
Dec 08 19:54:15 compute-0 sshd-session[52820]: Received disconnect from 159.223.8.81 port 36398:11: Bye Bye [preauth]
Dec 08 19:54:15 compute-0 sshd-session[52820]: Disconnected from invalid user wwwroot 159.223.8.81 port 36398 [preauth]
Dec 08 19:54:16 compute-0 sudo[52980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yejjirrxpbhmjajpcznthexnkxbyfoke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223655.7223508-76-152344120359647/AnsiballZ_dnf.py'
Dec 08 19:54:16 compute-0 sudo[52980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:16 compute-0 python3.9[52982]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:54:35 compute-0 kernel: SELinux:  Converting 2731 SID table entries...
Dec 08 19:54:35 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 19:54:35 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 08 19:54:35 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 19:54:35 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 08 19:54:35 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 19:54:35 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 19:54:35 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 19:54:35 compute-0 groupadd[53006]: group added to /etc/group: name=unbound, GID=993
Dec 08 19:54:35 compute-0 groupadd[53006]: group added to /etc/gshadow: name=unbound
Dec 08 19:54:35 compute-0 groupadd[53006]: new group: name=unbound, GID=993
Dec 08 19:54:35 compute-0 useradd[53013]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Dec 08 19:54:35 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 08 19:54:35 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 08 19:54:36 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 08 19:54:36 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 08 19:54:36 compute-0 systemd[1]: Reloading.
Dec 08 19:54:36 compute-0 systemd-rc-local-generator[53512]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:54:36 compute-0 systemd-sysv-generator[53515]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:54:36 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 08 19:54:37 compute-0 sudo[52980]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:38 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 08 19:54:38 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 08 19:54:38 compute-0 systemd[1]: run-rcb697dd29271463685deadffcce13b2e.service: Deactivated successfully.
Dec 08 19:54:38 compute-0 sudo[54079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udmmkngmblmuuwjevhxaamrctanqrsiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223677.7035336-84-58407826734883/AnsiballZ_systemd.py'
Dec 08 19:54:38 compute-0 sudo[54079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:38 compute-0 python3.9[54081]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 08 19:54:38 compute-0 systemd[1]: Reloading.
Dec 08 19:54:38 compute-0 systemd-rc-local-generator[54112]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:54:38 compute-0 systemd-sysv-generator[54115]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:54:38 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Dec 08 19:54:38 compute-0 chown[54123]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 08 19:54:39 compute-0 ovs-ctl[54128]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 08 19:54:39 compute-0 ovs-ctl[54128]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 08 19:54:39 compute-0 ovs-ctl[54128]: Starting ovsdb-server [  OK  ]
Dec 08 19:54:39 compute-0 ovs-vsctl[54177]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 08 19:54:39 compute-0 ovs-vsctl[54197]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"7a8539fb-8779-42f7-8fa8-222db61ea5ae\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 08 19:54:39 compute-0 ovs-ctl[54128]: Configuring Open vSwitch system IDs [  OK  ]
Dec 08 19:54:39 compute-0 ovs-ctl[54128]: Enabling remote OVSDB managers [  OK  ]
Dec 08 19:54:39 compute-0 ovs-vsctl[54203]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 08 19:54:39 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Dec 08 19:54:39 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 08 19:54:39 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 08 19:54:39 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 08 19:54:39 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Dec 08 19:54:39 compute-0 ovs-ctl[54247]: Inserting openvswitch module [  OK  ]
Dec 08 19:54:39 compute-0 ovs-ctl[54216]: Starting ovs-vswitchd [  OK  ]
Dec 08 19:54:39 compute-0 ovs-vsctl[54264]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Dec 08 19:54:39 compute-0 ovs-ctl[54216]: Enabling remote OVSDB managers [  OK  ]
Dec 08 19:54:39 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 08 19:54:39 compute-0 systemd[1]: Starting Open vSwitch...
Dec 08 19:54:39 compute-0 systemd[1]: Finished Open vSwitch.
Dec 08 19:54:39 compute-0 sudo[54079]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:40 compute-0 python3.9[54416]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:54:41 compute-0 sudo[54566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuyzcneeuavfupchmdslvzilahdacykk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223680.7129972-102-97313427150329/AnsiballZ_sefcontext.py'
Dec 08 19:54:41 compute-0 sudo[54566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:41 compute-0 python3.9[54568]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 08 19:54:43 compute-0 kernel: SELinux:  Converting 2745 SID table entries...
Dec 08 19:54:43 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 19:54:43 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 08 19:54:43 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 19:54:43 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 08 19:54:43 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 19:54:43 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 19:54:43 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 19:54:43 compute-0 sudo[54566]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:44 compute-0 python3.9[54723]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:54:45 compute-0 sudo[54879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqteesrtaateckdszebgfwnoibdbltel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223685.2752802-120-243794735672073/AnsiballZ_dnf.py'
Dec 08 19:54:45 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 08 19:54:45 compute-0 sudo[54879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:45 compute-0 python3.9[54881]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:54:47 compute-0 sudo[54879]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:47 compute-0 sudo[55032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hdfvjxycrtidossjgnnegsqazlyevrzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223687.2297413-128-21454470225026/AnsiballZ_command.py'
Dec 08 19:54:47 compute-0 sudo[55032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:47 compute-0 python3.9[55034]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:54:48 compute-0 sudo[55032]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:49 compute-0 sudo[55319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtftigzlvbugkbuqbuymoyegibjqzxmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223688.890738-136-186260634486083/AnsiballZ_file.py'
Dec 08 19:54:49 compute-0 sudo[55319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:49 compute-0 python3.9[55321]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 08 19:54:49 compute-0 sudo[55319]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:50 compute-0 python3.9[55471]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:54:51 compute-0 sudo[55623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elynaqxrveifyeihxbkijwtgpyrmuvrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223690.674912-152-23498140070125/AnsiballZ_dnf.py'
Dec 08 19:54:51 compute-0 sudo[55623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:51 compute-0 python3.9[55625]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:54:55 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 19:54:56 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 19:54:56 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 19:54:56 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 08 19:54:56 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 08 19:54:56 compute-0 systemd[1]: Reloading.
Dec 08 19:54:56 compute-0 systemd-rc-local-generator[55706]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:54:56 compute-0 systemd-sysv-generator[55710]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:54:57 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 08 19:54:57 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 08 19:54:57 compute-0 sudo[55623]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:57 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 08 19:54:57 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 08 19:54:57 compute-0 systemd[1]: run-r53f40b59fbc548c7a1edeacbdf823341.service: Deactivated successfully.
Dec 08 19:54:57 compute-0 systemd[1]: run-rd6ad948d6a1241e79bcc14bfc873a05a.service: Deactivated successfully.
Dec 08 19:54:58 compute-0 sudo[56209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsprcsthoionikujmuoxnzixwvyiogua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223697.8206046-160-88992452202697/AnsiballZ_systemd.py'
Dec 08 19:54:58 compute-0 sudo[56209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:58 compute-0 python3.9[56211]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 19:54:58 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 08 19:54:58 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Dec 08 19:54:58 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Dec 08 19:54:58 compute-0 systemd[1]: Stopping Network Manager...
Dec 08 19:54:58 compute-0 NetworkManager[7186]: <info>  [1765223698.4559] caught SIGTERM, shutting down normally.
Dec 08 19:54:58 compute-0 NetworkManager[7186]: <info>  [1765223698.4580] dhcp4 (eth0): canceled DHCP transaction
Dec 08 19:54:58 compute-0 NetworkManager[7186]: <info>  [1765223698.4581] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 08 19:54:58 compute-0 NetworkManager[7186]: <info>  [1765223698.4581] dhcp4 (eth0): state changed no lease
Dec 08 19:54:58 compute-0 NetworkManager[7186]: <info>  [1765223698.4583] manager: NetworkManager state is now CONNECTED_SITE
Dec 08 19:54:58 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 08 19:54:58 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 08 19:54:58 compute-0 NetworkManager[7186]: <info>  [1765223698.6633] exiting (success)
Dec 08 19:54:58 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 08 19:54:58 compute-0 systemd[1]: Stopped Network Manager.
Dec 08 19:54:58 compute-0 systemd[1]: NetworkManager.service: Consumed 14.630s CPU time, 4.1M memory peak, read 0B from disk, written 20.5K to disk.
Dec 08 19:54:58 compute-0 systemd[1]: Starting Network Manager...
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.7447] NetworkManager (version 1.54.2-1.el9) is starting... (after a restart, boot:dbd0b2df-41a2-4b72-b337-1b1fd8346088)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.7451] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.7548] manager[0x5651458f0000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 08 19:54:58 compute-0 systemd[1]: Starting Hostname Service...
Dec 08 19:54:58 compute-0 systemd[1]: Started Hostname Service.
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8474] hostname: hostname: using hostnamed
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8475] hostname: static hostname changed from (none) to "compute-0"
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8481] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8486] manager[0x5651458f0000]: rfkill: Wi-Fi hardware radio set enabled
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8487] manager[0x5651458f0000]: rfkill: WWAN hardware radio set enabled
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8512] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-ovs.so)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8522] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8523] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8523] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8524] manager: Networking is enabled by state file
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8526] settings: Loaded settings plugin: keyfile (internal)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8529] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8553] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8561] dhcp: init: Using DHCP client 'internal'
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8564] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8569] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8574] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8582] device (lo): Activation: starting connection 'lo' (da4f7c5b-a714-4e67-a816-06de13118f80)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8588] device (eth0): carrier: link connected
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8593] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8596] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8597] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8602] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8608] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8612] device (eth1): carrier: link connected
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8617] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8622] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (707dcd72-95a2-510c-83a0-f85b3c0b91a1) (indicated)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8622] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8627] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8635] device (eth1): Activation: starting connection 'ci-private-network' (707dcd72-95a2-510c-83a0-f85b3c0b91a1)
Dec 08 19:54:58 compute-0 systemd[1]: Started Network Manager.
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8642] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8653] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8657] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8659] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8661] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8664] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8667] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8671] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8677] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8684] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8687] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8694] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8706] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8715] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8716] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8720] device (lo): Activation: successful, device activated.
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8726] dhcp4 (eth0): state changed new lease, address=38.102.83.66
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8733] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 08 19:54:58 compute-0 systemd[1]: Starting Network Manager Wait Online...
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8826] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8835] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8845] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8854] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8860] device (eth1): Activation: successful, device activated.
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8875] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8878] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8883] manager: NetworkManager state is now CONNECTED_SITE
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8890] device (eth0): Activation: successful, device activated.
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8897] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 08 19:54:58 compute-0 NetworkManager[56229]: <info>  [1765223698.8903] manager: startup complete
Dec 08 19:54:58 compute-0 sudo[56209]: pam_unix(sudo:session): session closed for user root
Dec 08 19:54:58 compute-0 systemd[1]: Finished Network Manager Wait Online.
Dec 08 19:54:59 compute-0 sudo[56436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reikytolwywjeendtgktyicdcnouxgnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223699.1105056-168-239764013554008/AnsiballZ_dnf.py'
Dec 08 19:54:59 compute-0 sudo[56436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:54:59 compute-0 python3.9[56438]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:55:07 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 08 19:55:07 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 08 19:55:07 compute-0 systemd[1]: Reloading.
Dec 08 19:55:07 compute-0 systemd-rc-local-generator[56495]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:55:07 compute-0 systemd-sysv-generator[56498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:55:07 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 08 19:55:09 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 08 19:55:10 compute-0 sudo[56436]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:10 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 08 19:55:10 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 08 19:55:10 compute-0 systemd[1]: run-r03a147d5cdc74816a92a1e23fe29cf8c.service: Deactivated successfully.
Dec 08 19:55:10 compute-0 sudo[56901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odjpavycfbgekinjvifsvxizzhxwjlgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223710.3373997-180-69146634411722/AnsiballZ_stat.py'
Dec 08 19:55:10 compute-0 sudo[56901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:10 compute-0 python3.9[56903]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:55:10 compute-0 sudo[56901]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:11 compute-0 sudo[57053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajrzarnpuzvnjwlipemfvnsukbasulkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223711.0291927-189-178202123117553/AnsiballZ_ini_file.py'
Dec 08 19:55:11 compute-0 sudo[57053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:11 compute-0 python3.9[57055]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:11 compute-0 sudo[57053]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:12 compute-0 sudo[57207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmxtlylfyygflosbvscxdzjkvpqkfanp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223712.0035517-199-85150389569218/AnsiballZ_ini_file.py'
Dec 08 19:55:12 compute-0 sudo[57207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:12 compute-0 python3.9[57209]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:12 compute-0 sudo[57207]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:12 compute-0 sudo[57359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhnhrjbtzmkkzffqhpimtoxdjksylkys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223712.6492903-199-217117536486734/AnsiballZ_ini_file.py'
Dec 08 19:55:12 compute-0 sudo[57359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:13 compute-0 python3.9[57361]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:13 compute-0 sudo[57359]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:13 compute-0 sudo[57511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqxutidxgruiwwvvrmovecutiajzhjbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223713.271311-214-17829432535570/AnsiballZ_ini_file.py'
Dec 08 19:55:13 compute-0 sudo[57511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:13 compute-0 python3.9[57513]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:13 compute-0 sudo[57511]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:14 compute-0 sudo[57663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agoajhzynupmwovmaanzkqmfeybbvrto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223713.8772244-214-46342462678573/AnsiballZ_ini_file.py'
Dec 08 19:55:14 compute-0 sudo[57663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:14 compute-0 python3.9[57665]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:14 compute-0 sudo[57663]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:14 compute-0 sudo[57815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hrqhoeukdiclihahokrosqajuunjzwpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223714.4993494-229-172116588585125/AnsiballZ_stat.py'
Dec 08 19:55:14 compute-0 sudo[57815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:14 compute-0 python3.9[57817]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:55:14 compute-0 sudo[57815]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:15 compute-0 sudo[57938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yunbvgmbscnkrsiehxakmrjwafsqocft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223714.4993494-229-172116588585125/AnsiballZ_copy.py'
Dec 08 19:55:15 compute-0 sudo[57938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:15 compute-0 python3.9[57940]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223714.4993494-229-172116588585125/.source _original_basename=.633gouw_ follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:15 compute-0 sudo[57938]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:16 compute-0 sudo[58090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqbtfdlfwutrabafmannnfvwpizwiaox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223715.8584962-244-33293549246232/AnsiballZ_file.py'
Dec 08 19:55:16 compute-0 sudo[58090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:16 compute-0 python3.9[58092]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:16 compute-0 sudo[58090]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:17 compute-0 sudo[58242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhzrzqucfbkdyxedbzzkckweslofpdyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223716.6124415-252-265243766003136/AnsiballZ_edpm_os_net_config_mappings.py'
Dec 08 19:55:17 compute-0 sudo[58242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:17 compute-0 python3.9[58244]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 08 19:55:17 compute-0 sudo[58242]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:17 compute-0 sudo[58394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvymzmcdnwdtjjzmqiyelpxuivsesxyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223717.5123243-261-416205168244/AnsiballZ_file.py'
Dec 08 19:55:17 compute-0 sudo[58394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:17 compute-0 python3.9[58396]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:17 compute-0 sudo[58394]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:18 compute-0 sudo[58546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhvhmrweykkmesozezbhgdakpdohlzqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223718.3205907-271-260822355509800/AnsiballZ_stat.py'
Dec 08 19:55:18 compute-0 sudo[58546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:18 compute-0 sudo[58546]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:19 compute-0 sudo[58671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybdfgabwdqxqcjjzsilnmjnxwgndimth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223718.3205907-271-260822355509800/AnsiballZ_copy.py'
Dec 08 19:55:19 compute-0 sudo[58671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:19 compute-0 sshd-session[58572]: Invalid user amin from 159.223.8.81 port 56884
Dec 08 19:55:19 compute-0 sudo[58671]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:19 compute-0 sshd-session[58572]: Received disconnect from 159.223.8.81 port 56884:11: Bye Bye [preauth]
Dec 08 19:55:19 compute-0 sshd-session[58572]: Disconnected from invalid user amin 159.223.8.81 port 56884 [preauth]
Dec 08 19:55:19 compute-0 sudo[58823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owdpwakggwgyhlirzvcnmwozsinajtnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223719.5314088-286-124784993878686/AnsiballZ_slurp.py'
Dec 08 19:55:19 compute-0 sudo[58823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:20 compute-0 python3.9[58825]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 08 19:55:20 compute-0 sudo[58823]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:21 compute-0 sudo[58998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-takqtvqtjxgxtdjlgmzixnwzjgsbepsw ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223720.637245-295-113421292961582/async_wrapper.py j252538509123 300 /home/zuul/.ansible/tmp/ansible-tmp-1765223720.637245-295-113421292961582/AnsiballZ_edpm_os_net_config.py _'
Dec 08 19:55:21 compute-0 sudo[58998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:21 compute-0 ansible-async_wrapper.py[59000]: Invoked with j252538509123 300 /home/zuul/.ansible/tmp/ansible-tmp-1765223720.637245-295-113421292961582/AnsiballZ_edpm_os_net_config.py _
Dec 08 19:55:21 compute-0 ansible-async_wrapper.py[59003]: Starting module and watcher
Dec 08 19:55:21 compute-0 ansible-async_wrapper.py[59003]: Start watching 59004 (300)
Dec 08 19:55:21 compute-0 ansible-async_wrapper.py[59004]: Start module (59004)
Dec 08 19:55:21 compute-0 ansible-async_wrapper.py[59000]: Return async_wrapper task started.
Dec 08 19:55:21 compute-0 sudo[58998]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:21 compute-0 python3.9[59005]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 08 19:55:22 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 08 19:55:22 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 08 19:55:22 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 08 19:55:22 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 08 19:55:22 compute-0 kernel: cfg80211: failed to load regulatory.db
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.4623] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.4644] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5153] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5155] audit: op="connection-add" uuid="2eb2906d-ee1c-4d14-83d3-a4ed3d30d311" name="br-ex-br" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5170] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5171] audit: op="connection-add" uuid="f1e4bf6c-4721-4cab-952c-cb794e5b4ba7" name="br-ex-port" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5181] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5183] audit: op="connection-add" uuid="9761ab2a-b196-4237-8ab0-8c04e0ab7014" name="eth1-port" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5193] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5194] audit: op="connection-add" uuid="1535857f-404e-40a3-9f10-f1f6e2107fa4" name="vlan20-port" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5203] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5205] audit: op="connection-add" uuid="bb97102f-5757-4e30-9cf2-4baddf2b0018" name="vlan21-port" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5214] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5217] audit: op="connection-add" uuid="f6323f12-ad32-4587-8c16-ce1aa0b41ebd" name="vlan22-port" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5238] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5252] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.5253] audit: op="connection-add" uuid="2a24811b-134a-4a7f-941c-bf7e67efb2cf" name="br-ex-if" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6673] audit: op="connection-update" uuid="707dcd72-95a2-510c-83a0-f85b3c0b91a1" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,ipv6.method,ipv6.addr-gen-mode,ipv6.routes,ipv6.routing-rules,ipv6.addresses,ipv6.dns,ipv4.method,ipv4.never-default,ipv4.routes,ipv4.routing-rules,ipv4.addresses,ipv4.dns,connection.timestamp,connection.master,connection.port-type,connection.controller,connection.slave-type" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6692] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6696] audit: op="connection-add" uuid="2b5af080-d324-49cc-b285-ca483e8ca63b" name="vlan20-if" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6729] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6733] audit: op="connection-add" uuid="cb6a5fd0-efb5-428b-ac81-84bf0f2be651" name="vlan21-if" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6764] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6769] audit: op="connection-add" uuid="71c15232-28cd-4fbc-ac05-623e82229b20" name="vlan22-if" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6791] audit: op="connection-delete" uuid="ce215099-8d27-3935-a760-83a6a9e7b4be" name="Wired connection 1" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6815] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.6822] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6842] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6852] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (2eb2906d-ee1c-4d14-83d3-a4ed3d30d311)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6855] audit: op="connection-activate" uuid="2eb2906d-ee1c-4d14-83d3-a4ed3d30d311" name="br-ex-br" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6861] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.6864] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6876] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6885] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (f1e4bf6c-4721-4cab-952c-cb794e5b4ba7)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6890] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.6893] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6904] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6913] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (9761ab2a-b196-4237-8ab0-8c04e0ab7014)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6918] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.6922] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6934] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6944] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (1535857f-404e-40a3-9f10-f1f6e2107fa4)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6948] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.6952] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6963] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6973] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (bb97102f-5757-4e30-9cf2-4baddf2b0018)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6978] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.6982] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.6994] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7004] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (f6323f12-ad32-4587-8c16-ce1aa0b41ebd)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7007] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7014] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7019] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7033] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.7036] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7044] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7053] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (2a24811b-134a-4a7f-941c-bf7e67efb2cf)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7055] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7058] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7060] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7062] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7063] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7073] device (eth1): disconnecting for new activation request.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7075] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7078] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7080] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7081] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7084] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.7085] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7087] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7091] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (2b5af080-d324-49cc-b285-ca483e8ca63b)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7093] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7095] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7097] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7100] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7103] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.7105] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7108] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7112] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (cb6a5fd0-efb5-428b-ac81-84bf0f2be651)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7114] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7117] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7119] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7120] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7123] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <warn>  [1765223723.7125] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7127] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7149] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (71c15232-28cd-4fbc-ac05-623e82229b20)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7150] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7154] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7157] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7159] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7161] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7181] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv6.method,ipv6.addr-gen-mode,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7184] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7189] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7191] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7200] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7204] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7209] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7214] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7216] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 kernel: ovs-system: entered promiscuous mode
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7233] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7237] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7241] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7243] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 kernel: Timeout policy base is empty
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7250] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7254] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 systemd-udevd[59013]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7258] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7261] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7269] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7276] dhcp4 (eth0): canceled DHCP transaction
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7276] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7276] dhcp4 (eth0): state changed no lease
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7278] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7293] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7297] audit: op="device-reapply" interface="eth1" ifindex=3 pid=59006 uid=0 result="fail" reason="Device is not activated"
Dec 08 19:55:23 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 08 19:55:23 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7692] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7697] dhcp4 (eth0): state changed new lease, address=38.102.83.66
Dec 08 19:55:23 compute-0 kernel: br-ex: entered promiscuous mode
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7709] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7766] device (eth1): disconnecting for new activation request.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7766] audit: op="connection-activate" uuid="707dcd72-95a2-510c-83a0-f85b3c0b91a1" name="ci-private-network" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7767] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 08 19:55:23 compute-0 kernel: vlan22: entered promiscuous mode
Dec 08 19:55:23 compute-0 systemd-udevd[59014]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 19:55:23 compute-0 kernel: vlan21: entered promiscuous mode
Dec 08 19:55:23 compute-0 systemd-udevd[59012]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7907] device (eth1): Activation: starting connection 'ci-private-network' (707dcd72-95a2-510c-83a0-f85b3c0b91a1)
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7916] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7917] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7919] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7921] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7922] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7923] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7933] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 08 19:55:23 compute-0 kernel: vlan20: entered promiscuous mode
Dec 08 19:55:23 compute-0 systemd-udevd[59103]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7945] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7948] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7954] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7957] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7962] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7965] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7968] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7970] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7973] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7975] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7980] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7983] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7986] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7988] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.7996] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8003] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8007] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59006 uid=0 result="success"
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8015] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8026] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8027] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8030] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8044] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8056] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8068] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8070] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8077] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8083] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8087] device (eth1): Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8091] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8093] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8098] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8102] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8106] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8110] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8119] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8127] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8129] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8134] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8173] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8175] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 08 19:55:23 compute-0 NetworkManager[56229]: <info>  [1765223723.8179] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 08 19:55:24 compute-0 sshd-session[59007]: Invalid user sftpuser from 45.78.228.32 port 35166
Dec 08 19:55:24 compute-0 NetworkManager[56229]: <info>  [1765223724.9507] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59006 uid=0 result="success"
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.1113] checkpoint[0x5651458c6950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.1116] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=59006 uid=0 result="success"
Dec 08 19:55:25 compute-0 sudo[59340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lomyejkbbnucjmhveweqgptsqjuisoek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223724.7131386-295-53574328400304/AnsiballZ_async_status.py'
Dec 08 19:55:25 compute-0 sudo[59340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.3866] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59006 uid=0 result="success"
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.3877] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59006 uid=0 result="success"
Dec 08 19:55:25 compute-0 python3.9[59342]: ansible-ansible.legacy.async_status Invoked with jid=j252538509123.59000 mode=status _async_dir=/root/.ansible_async
Dec 08 19:55:25 compute-0 sudo[59340]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.5685] audit: op="networking-control" arg="global-dns-configuration" pid=59006 uid=0 result="success"
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.5716] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.5743] audit: op="networking-control" arg="global-dns-configuration" pid=59006 uid=0 result="success"
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.5759] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59006 uid=0 result="success"
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.7136] checkpoint[0x5651458c6a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 08 19:55:25 compute-0 NetworkManager[56229]: <info>  [1765223725.7140] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=59006 uid=0 result="success"
Dec 08 19:55:25 compute-0 ansible-async_wrapper.py[59004]: Module complete (59004)
Dec 08 19:55:26 compute-0 ansible-async_wrapper.py[59003]: Done in kid B.
Dec 08 19:55:28 compute-0 sudo[59445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbapfeiyguuchavokmnauqqdfhbouybp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223724.7131386-295-53574328400304/AnsiballZ_async_status.py'
Dec 08 19:55:28 compute-0 sudo[59445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:28 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 08 19:55:28 compute-0 python3.9[59447]: ansible-ansible.legacy.async_status Invoked with jid=j252538509123.59000 mode=status _async_dir=/root/.ansible_async
Dec 08 19:55:28 compute-0 sudo[59445]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:29 compute-0 sudo[59548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjtwzwckzmpazjmbdcyheypofuksdtrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223724.7131386-295-53574328400304/AnsiballZ_async_status.py'
Dec 08 19:55:29 compute-0 sudo[59548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:29 compute-0 sshd-session[59007]: Received disconnect from 45.78.228.32 port 35166:11: Bye Bye [preauth]
Dec 08 19:55:29 compute-0 sshd-session[59007]: Disconnected from invalid user sftpuser 45.78.228.32 port 35166 [preauth]
Dec 08 19:55:29 compute-0 python3.9[59550]: ansible-ansible.legacy.async_status Invoked with jid=j252538509123.59000 mode=cleanup _async_dir=/root/.ansible_async
Dec 08 19:55:29 compute-0 sudo[59548]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:30 compute-0 sudo[59700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nabstctbyyyahlxsbxqiuldpqxunvfux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223729.7840624-322-173036662498518/AnsiballZ_stat.py'
Dec 08 19:55:30 compute-0 sudo[59700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:30 compute-0 python3.9[59702]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:55:30 compute-0 sudo[59700]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:30 compute-0 sudo[59823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhuighfnjhetgtgdkhcgepmhjpzkghnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223729.7840624-322-173036662498518/AnsiballZ_copy.py'
Dec 08 19:55:30 compute-0 sudo[59823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:30 compute-0 python3.9[59825]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223729.7840624-322-173036662498518/.source.returncode _original_basename=.m01_10mr follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:30 compute-0 sudo[59823]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:31 compute-0 sshd-session[59827]: Invalid user noc from 172.190.42.55 port 34048
Dec 08 19:55:31 compute-0 sshd-session[59827]: Received disconnect from 172.190.42.55 port 34048:11: Bye Bye [preauth]
Dec 08 19:55:31 compute-0 sshd-session[59827]: Disconnected from invalid user noc 172.190.42.55 port 34048 [preauth]
Dec 08 19:55:31 compute-0 sudo[59977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cctyvztwkmhhbvwdnzotoecsgxlwqpbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223731.0973396-338-126975476782963/AnsiballZ_stat.py'
Dec 08 19:55:31 compute-0 sudo[59977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:31 compute-0 python3.9[59979]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:55:31 compute-0 sudo[59977]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:31 compute-0 sudo[60100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvbyqvccurkgsutqijomjazybvlpvtmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223731.0973396-338-126975476782963/AnsiballZ_copy.py'
Dec 08 19:55:31 compute-0 sudo[60100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:32 compute-0 python3.9[60102]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223731.0973396-338-126975476782963/.source.cfg _original_basename=.h6_rgzxz follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:32 compute-0 sudo[60100]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:32 compute-0 sudo[60253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfjnglrysaknkdtcehxguhlbceovshvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223732.2630448-353-21950726632175/AnsiballZ_systemd.py'
Dec 08 19:55:32 compute-0 sudo[60253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:32 compute-0 python3.9[60255]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 19:55:32 compute-0 systemd[1]: Reloading Network Manager...
Dec 08 19:55:32 compute-0 NetworkManager[56229]: <info>  [1765223732.9184] audit: op="reload" arg="0" pid=60259 uid=0 result="success"
Dec 08 19:55:32 compute-0 NetworkManager[56229]: <info>  [1765223732.9193] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 08 19:55:32 compute-0 systemd[1]: Reloaded Network Manager.
Dec 08 19:55:32 compute-0 sudo[60253]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:33 compute-0 sshd-session[51955]: Connection closed by 192.168.122.30 port 39626
Dec 08 19:55:33 compute-0 sshd-session[51952]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:55:33 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Dec 08 19:55:33 compute-0 systemd[1]: session-11.scope: Consumed 50.971s CPU time.
Dec 08 19:55:33 compute-0 systemd-logind[793]: Session 11 logged out. Waiting for processes to exit.
Dec 08 19:55:33 compute-0 systemd-logind[793]: Removed session 11.
Dec 08 19:55:39 compute-0 sshd-session[60291]: Accepted publickey for zuul from 192.168.122.30 port 38768 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:55:39 compute-0 systemd-logind[793]: New session 12 of user zuul.
Dec 08 19:55:39 compute-0 systemd[1]: Started Session 12 of User zuul.
Dec 08 19:55:39 compute-0 sshd-session[60291]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:55:40 compute-0 python3.9[60444]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:55:41 compute-0 python3.9[60598]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:55:42 compute-0 python3.9[60788]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:55:42 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 08 19:55:43 compute-0 sshd-session[60294]: Connection closed by 192.168.122.30 port 38768
Dec 08 19:55:43 compute-0 sshd-session[60291]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:55:43 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Dec 08 19:55:43 compute-0 systemd[1]: session-12.scope: Consumed 2.365s CPU time.
Dec 08 19:55:43 compute-0 systemd-logind[793]: Session 12 logged out. Waiting for processes to exit.
Dec 08 19:55:43 compute-0 systemd-logind[793]: Removed session 12.
Dec 08 19:55:48 compute-0 sshd-session[60818]: Accepted publickey for zuul from 192.168.122.30 port 34398 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:55:48 compute-0 systemd-logind[793]: New session 13 of user zuul.
Dec 08 19:55:48 compute-0 systemd[1]: Started Session 13 of User zuul.
Dec 08 19:55:48 compute-0 sshd-session[60818]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:55:49 compute-0 python3.9[60971]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:55:50 compute-0 python3.9[61125]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:55:51 compute-0 sudo[61280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjswjnqrtgauqlqakadcfpfnhkjvqkuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223750.954591-40-1739434560272/AnsiballZ_setup.py'
Dec 08 19:55:51 compute-0 sudo[61280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:51 compute-0 python3.9[61282]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:55:51 compute-0 sudo[61280]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:52 compute-0 sudo[61364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewtmtkyuqumfkpenaibwhtltrjwhfnom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223750.954591-40-1739434560272/AnsiballZ_dnf.py'
Dec 08 19:55:52 compute-0 sudo[61364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:52 compute-0 python3.9[61366]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:55:54 compute-0 sudo[61364]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:54 compute-0 sudo[61518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnzommpxygmdrsiutqhikjqkgvzhvkhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223754.4924412-52-266765382633690/AnsiballZ_setup.py'
Dec 08 19:55:54 compute-0 sudo[61518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:55 compute-0 python3.9[61520]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:55:55 compute-0 sudo[61518]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:56 compute-0 sudo[61709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqwvhlgnkrxmvcmefzlwxbcqdyykumag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223755.572135-63-123262443854726/AnsiballZ_file.py'
Dec 08 19:55:56 compute-0 sudo[61709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:56 compute-0 python3.9[61711]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:56 compute-0 sudo[61709]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:56 compute-0 sudo[61861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iontonsuaxyebvjqqjhtdqjivwbtkupi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223756.4358559-71-267998041149756/AnsiballZ_command.py'
Dec 08 19:55:56 compute-0 sudo[61861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:57 compute-0 python3.9[61863]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:55:57 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:55:57 compute-0 sudo[61861]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:57 compute-0 sudo[62025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xptwhufpwvdybvmbrrbwyswughomrqqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223757.3409355-79-252722396887273/AnsiballZ_stat.py'
Dec 08 19:55:57 compute-0 sudo[62025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:57 compute-0 python3.9[62027]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:55:57 compute-0 sudo[62025]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:58 compute-0 sudo[62103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvygtgafbhlokbfcyxcjgagygypmdvvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223757.3409355-79-252722396887273/AnsiballZ_file.py'
Dec 08 19:55:58 compute-0 sudo[62103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:58 compute-0 python3.9[62105]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:55:58 compute-0 sudo[62103]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:58 compute-0 sudo[62255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjdoiphbctfojbdpdkjfycxanhqreqyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223758.632777-91-253304989868092/AnsiballZ_stat.py'
Dec 08 19:55:58 compute-0 sudo[62255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:59 compute-0 python3.9[62257]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:55:59 compute-0 sudo[62255]: pam_unix(sudo:session): session closed for user root
Dec 08 19:55:59 compute-0 sudo[62333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdizdtbjvhntlgxhxhvmvbljmhjftvhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223758.632777-91-253304989868092/AnsiballZ_file.py'
Dec 08 19:55:59 compute-0 sudo[62333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:55:59 compute-0 python3.9[62335]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:55:59 compute-0 sudo[62333]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:00 compute-0 sudo[62485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nizjdqdnorprnqcqvhljqkyfjrfaplyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223759.9370675-104-63796940260343/AnsiballZ_ini_file.py'
Dec 08 19:56:00 compute-0 sudo[62485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:00 compute-0 python3.9[62487]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:56:00 compute-0 sudo[62485]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:01 compute-0 sudo[62637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwwpyhxcejuxxjgcgzudcuvqmduffmnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223761.009116-104-146192398903588/AnsiballZ_ini_file.py'
Dec 08 19:56:01 compute-0 sudo[62637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:01 compute-0 python3.9[62639]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:56:01 compute-0 sudo[62637]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:01 compute-0 sudo[62789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knvgfauookcgycdfqtixtdldsrgvabfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223761.61216-104-87251890267258/AnsiballZ_ini_file.py'
Dec 08 19:56:01 compute-0 sudo[62789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:02 compute-0 python3.9[62791]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:56:02 compute-0 sudo[62789]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:02 compute-0 sudo[62941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkvpufyjzvgpnuubawpelmsupvtyydmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223762.2971327-104-221499192328134/AnsiballZ_ini_file.py'
Dec 08 19:56:02 compute-0 sudo[62941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:02 compute-0 python3.9[62943]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:56:02 compute-0 sudo[62941]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:03 compute-0 sudo[63093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvimtkybxqwrrfxofavvwntaargchjac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223763.054492-135-208283680593228/AnsiballZ_dnf.py'
Dec 08 19:56:03 compute-0 sudo[63093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:03 compute-0 python3.9[63095]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:56:04 compute-0 sudo[63093]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:05 compute-0 sudo[63247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzpjyibdttnkgxcdwkbhqkrvoiqymfwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223765.1908987-146-180718186647450/AnsiballZ_setup.py'
Dec 08 19:56:05 compute-0 sudo[63247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:05 compute-0 python3.9[63249]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:56:05 compute-0 sudo[63247]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:06 compute-0 sudo[63401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpjicneiijielxiqejdlplhhnzkzdmhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223766.053644-154-28250007326702/AnsiballZ_stat.py'
Dec 08 19:56:06 compute-0 sudo[63401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:06 compute-0 python3.9[63403]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:56:06 compute-0 sudo[63401]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:07 compute-0 sudo[63553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzrmazjiybxfulddeoybwlahqmiwzbiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223766.768562-163-86127832875224/AnsiballZ_stat.py'
Dec 08 19:56:07 compute-0 sudo[63553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:07 compute-0 python3.9[63555]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:56:07 compute-0 sudo[63553]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:07 compute-0 sudo[63705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vaypxmedhccgmxvoclziksyrlugsquqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223767.5729356-173-30254198517344/AnsiballZ_command.py'
Dec 08 19:56:07 compute-0 sudo[63705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:08 compute-0 python3.9[63707]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:56:08 compute-0 sudo[63705]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:08 compute-0 sudo[63858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqogkvbqafwcoupuuleezywyeypmiczb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223768.340202-183-270571503769098/AnsiballZ_service_facts.py'
Dec 08 19:56:08 compute-0 sudo[63858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:09 compute-0 python3.9[63860]: ansible-service_facts Invoked
Dec 08 19:56:09 compute-0 network[63877]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 08 19:56:09 compute-0 network[63878]: 'network-scripts' will be removed from distribution in near future.
Dec 08 19:56:09 compute-0 network[63879]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 08 19:56:14 compute-0 sudo[63858]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:15 compute-0 sudo[64162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntgqqjnqrptgquvqozgjompldtnpgprq ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1765223774.7164533-198-132762422856000/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1765223774.7164533-198-132762422856000/args'
Dec 08 19:56:15 compute-0 sudo[64162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:15 compute-0 sudo[64162]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:16 compute-0 sudo[64329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udcyczmvhbjhyvpdjwqnrbqvsqfsloqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223775.7946632-209-146840916390349/AnsiballZ_dnf.py'
Dec 08 19:56:16 compute-0 sudo[64329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:16 compute-0 python3.9[64331]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:56:17 compute-0 sudo[64329]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:18 compute-0 sudo[64482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqqvmgrzxczldfqyugkrzuddkregunnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223777.8797903-222-108372777272088/AnsiballZ_package_facts.py'
Dec 08 19:56:18 compute-0 sudo[64482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:19 compute-0 python3.9[64484]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 08 19:56:19 compute-0 sudo[64482]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:20 compute-0 sudo[64634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvnbzomcnwqezuveyjqrtnralbvocirf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223779.6667278-232-150647555679907/AnsiballZ_stat.py'
Dec 08 19:56:20 compute-0 sudo[64634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:20 compute-0 python3.9[64636]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:20 compute-0 sudo[64634]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:20 compute-0 sudo[64761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgangzuyjdjouhayzpdvnqyletqsthig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223779.6667278-232-150647555679907/AnsiballZ_copy.py'
Dec 08 19:56:20 compute-0 sudo[64761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:20 compute-0 sshd-session[64686]: Invalid user ubuntu from 159.223.8.81 port 52952
Dec 08 19:56:20 compute-0 sshd-session[64686]: Received disconnect from 159.223.8.81 port 52952:11: Bye Bye [preauth]
Dec 08 19:56:20 compute-0 sshd-session[64686]: Disconnected from invalid user ubuntu 159.223.8.81 port 52952 [preauth]
Dec 08 19:56:21 compute-0 python3.9[64763]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223779.6667278-232-150647555679907/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:21 compute-0 sudo[64761]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:21 compute-0 sudo[64917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfzczghmdtbypukfrfieegsjrqlmojnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223781.301544-247-147489979479200/AnsiballZ_stat.py'
Dec 08 19:56:21 compute-0 sudo[64917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:21 compute-0 python3.9[64919]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:21 compute-0 sudo[64917]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:22 compute-0 sshd-session[64764]: Invalid user cheeki from 222.172.32.246 port 2174
Dec 08 19:56:22 compute-0 sudo[65042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgzzibviknadnmremthrhabeavgbaohq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223781.301544-247-147489979479200/AnsiballZ_copy.py'
Dec 08 19:56:22 compute-0 sudo[65042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:22 compute-0 sshd-session[64764]: Received disconnect from 222.172.32.246 port 2174:11: Bye Bye [preauth]
Dec 08 19:56:22 compute-0 sshd-session[64764]: Disconnected from invalid user cheeki 222.172.32.246 port 2174 [preauth]
Dec 08 19:56:22 compute-0 python3.9[65044]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223781.301544-247-147489979479200/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:22 compute-0 sudo[65042]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:23 compute-0 sudo[65196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuzptgasydymdzcwwcthtntrommmszzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223783.1287444-268-93541074999398/AnsiballZ_lineinfile.py'
Dec 08 19:56:23 compute-0 sudo[65196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:23 compute-0 python3.9[65198]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:23 compute-0 sudo[65196]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:24 compute-0 sudo[65350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nyqsiypauavlyjmwzvycoysrpxvvptoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223784.387279-283-246846566328278/AnsiballZ_setup.py'
Dec 08 19:56:24 compute-0 sudo[65350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:24 compute-0 python3.9[65352]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:56:25 compute-0 sudo[65350]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:26 compute-0 sudo[65434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhcgdxzpyobqhguruuzzlqnoqguyajbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223784.387279-283-246846566328278/AnsiballZ_systemd.py'
Dec 08 19:56:26 compute-0 sudo[65434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:26 compute-0 python3.9[65436]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:56:26 compute-0 sudo[65434]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:27 compute-0 sudo[65588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zayoctcqmzkakpagulnccifbsvrjnmgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223786.9821687-299-52900655310864/AnsiballZ_setup.py'
Dec 08 19:56:27 compute-0 sudo[65588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:27 compute-0 python3.9[65590]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:56:27 compute-0 sudo[65588]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:28 compute-0 sudo[65672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyfmnwttlcpmnuuoqgqcldlxkpsqrxot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223786.9821687-299-52900655310864/AnsiballZ_systemd.py'
Dec 08 19:56:28 compute-0 sudo[65672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:28 compute-0 python3.9[65674]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 19:56:28 compute-0 chronyd[792]: chronyd exiting
Dec 08 19:56:28 compute-0 systemd[1]: Stopping NTP client/server...
Dec 08 19:56:28 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Dec 08 19:56:28 compute-0 systemd[1]: Stopped NTP client/server.
Dec 08 19:56:28 compute-0 systemd[1]: Starting NTP client/server...
Dec 08 19:56:28 compute-0 chronyd[65682]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 08 19:56:28 compute-0 chronyd[65682]: Frequency -31.556 +/- 0.286 ppm read from /var/lib/chrony/drift
Dec 08 19:56:28 compute-0 chronyd[65682]: Loaded seccomp filter (level 2)
Dec 08 19:56:28 compute-0 systemd[1]: Started NTP client/server.
Dec 08 19:56:28 compute-0 sudo[65672]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:29 compute-0 sshd-session[60821]: Connection closed by 192.168.122.30 port 34398
Dec 08 19:56:29 compute-0 sshd-session[60818]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:56:29 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Dec 08 19:56:29 compute-0 systemd[1]: session-13.scope: Consumed 26.816s CPU time.
Dec 08 19:56:29 compute-0 systemd-logind[793]: Session 13 logged out. Waiting for processes to exit.
Dec 08 19:56:29 compute-0 systemd-logind[793]: Removed session 13.
Dec 08 19:56:35 compute-0 sshd-session[65709]: Accepted publickey for zuul from 192.168.122.30 port 36716 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:56:35 compute-0 systemd-logind[793]: New session 14 of user zuul.
Dec 08 19:56:35 compute-0 systemd[1]: Started Session 14 of User zuul.
Dec 08 19:56:35 compute-0 sshd-session[65709]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:56:36 compute-0 python3.9[65862]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:56:37 compute-0 sudo[66016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgwkgdbnlkvjankjpsfzzengywvhleye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223796.8181064-33-57816236823130/AnsiballZ_file.py'
Dec 08 19:56:37 compute-0 sudo[66016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:37 compute-0 python3.9[66018]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:37 compute-0 sudo[66016]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:38 compute-0 sudo[66191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btemgvcergfqljdfpshxqvkiigkodprp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223797.7350771-41-279068496469351/AnsiballZ_stat.py'
Dec 08 19:56:38 compute-0 sudo[66191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:38 compute-0 python3.9[66193]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:38 compute-0 sudo[66191]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:38 compute-0 sudo[66269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veffantakgdqkaugerudluvghbzxsaoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223797.7350771-41-279068496469351/AnsiballZ_file.py'
Dec 08 19:56:38 compute-0 sudo[66269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:39 compute-0 python3.9[66271]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.qszqhxpj recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:39 compute-0 sudo[66269]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:39 compute-0 sudo[66421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axzdhettkpkmtbiiqejrqnwtanwwelyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223799.4284277-61-110036273131608/AnsiballZ_stat.py'
Dec 08 19:56:39 compute-0 sudo[66421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:39 compute-0 python3.9[66423]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:39 compute-0 sudo[66421]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:40 compute-0 sudo[66544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxdzlwphplzpbxacvhyklngqgqhzyetm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223799.4284277-61-110036273131608/AnsiballZ_copy.py'
Dec 08 19:56:40 compute-0 sudo[66544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:40 compute-0 python3.9[66546]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223799.4284277-61-110036273131608/.source _original_basename=.db35wvl7 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:40 compute-0 sudo[66544]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:41 compute-0 sudo[66696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atmxnyusilwgfjapdfymymmkcownkkmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223800.8423364-77-32234479213351/AnsiballZ_file.py'
Dec 08 19:56:41 compute-0 sudo[66696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:41 compute-0 python3.9[66698]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:56:41 compute-0 sudo[66696]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:41 compute-0 sudo[66848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlkfxtqnrvhlkwheunricpgktlhktslf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223801.5467188-85-63524313108503/AnsiballZ_stat.py'
Dec 08 19:56:41 compute-0 sudo[66848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:42 compute-0 python3.9[66850]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:42 compute-0 sudo[66848]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:42 compute-0 sudo[66971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxaxhkznewokozbfpaoagbyqapljtrea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223801.5467188-85-63524313108503/AnsiballZ_copy.py'
Dec 08 19:56:42 compute-0 sudo[66971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:42 compute-0 python3.9[66973]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765223801.5467188-85-63524313108503/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:56:42 compute-0 sudo[66971]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:43 compute-0 sudo[67123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfngrcmkmlalhbivvvajsfghcdhduhfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223802.7559323-85-219562800436935/AnsiballZ_stat.py'
Dec 08 19:56:43 compute-0 sudo[67123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:43 compute-0 python3.9[67125]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:43 compute-0 sudo[67123]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:43 compute-0 sudo[67246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpbwzikxxjsproblhtojmeagcwyhrext ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223802.7559323-85-219562800436935/AnsiballZ_copy.py'
Dec 08 19:56:43 compute-0 sudo[67246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:43 compute-0 python3.9[67248]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765223802.7559323-85-219562800436935/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:56:43 compute-0 sudo[67246]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:44 compute-0 sudo[67398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxmhuudtfkswhfzkmkqlruzezbopujsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223803.991307-114-11907276065278/AnsiballZ_file.py'
Dec 08 19:56:44 compute-0 sudo[67398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:44 compute-0 python3.9[67400]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:44 compute-0 sudo[67398]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:44 compute-0 sshd-session[67424]: Received disconnect from 172.190.42.55 port 35112:11: Bye Bye [preauth]
Dec 08 19:56:44 compute-0 sshd-session[67424]: Disconnected from authenticating user root 172.190.42.55 port 35112 [preauth]
Dec 08 19:56:44 compute-0 sudo[67552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvktknxiojpsnrjqmbyzqumzkvwqvtof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223804.671732-122-21572519580381/AnsiballZ_stat.py'
Dec 08 19:56:44 compute-0 sudo[67552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:45 compute-0 python3.9[67554]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:45 compute-0 sudo[67552]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:45 compute-0 sudo[67675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilmykkfeostoodyvniyugoviujcaepny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223804.671732-122-21572519580381/AnsiballZ_copy.py'
Dec 08 19:56:45 compute-0 sudo[67675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:45 compute-0 python3.9[67677]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223804.671732-122-21572519580381/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:45 compute-0 sudo[67675]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:46 compute-0 sudo[67827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omfocbdlvoueilnsjnordimihkqovfkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223806.1141908-137-149843834008047/AnsiballZ_stat.py'
Dec 08 19:56:46 compute-0 sudo[67827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:46 compute-0 python3.9[67829]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:46 compute-0 sudo[67827]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:47 compute-0 sudo[67950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlpppkmrffvlwieeppzrbqqxcafkfrfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223806.1141908-137-149843834008047/AnsiballZ_copy.py'
Dec 08 19:56:47 compute-0 sudo[67950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:47 compute-0 python3.9[67952]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223806.1141908-137-149843834008047/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:47 compute-0 sudo[67950]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:48 compute-0 sudo[68102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kecuqrczblqalfpldhebdgxaoxdwases ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223807.3855677-152-5329137486354/AnsiballZ_systemd.py'
Dec 08 19:56:48 compute-0 sudo[68102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:48 compute-0 python3.9[68104]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:56:48 compute-0 systemd[1]: Reloading.
Dec 08 19:56:48 compute-0 systemd-rc-local-generator[68127]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:56:48 compute-0 systemd-sysv-generator[68136]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:56:48 compute-0 systemd[1]: Reloading.
Dec 08 19:56:48 compute-0 systemd-sysv-generator[68174]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:56:48 compute-0 systemd-rc-local-generator[68170]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:56:48 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Dec 08 19:56:48 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Dec 08 19:56:48 compute-0 sudo[68102]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:49 compute-0 sudo[68332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psiawugxeduevfasufdyzfcarkolprvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223809.0545862-160-25210676660952/AnsiballZ_stat.py'
Dec 08 19:56:49 compute-0 sudo[68332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:49 compute-0 python3.9[68334]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:49 compute-0 sudo[68332]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:49 compute-0 sudo[68455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btdjdmaprjzcycexcbewfiehvigfuphi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223809.0545862-160-25210676660952/AnsiballZ_copy.py'
Dec 08 19:56:49 compute-0 sudo[68455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:50 compute-0 python3.9[68457]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223809.0545862-160-25210676660952/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:50 compute-0 sudo[68455]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:50 compute-0 sudo[68607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnuhnfvdonkfjdbisxpjugmcoqjtjdrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223810.3015904-175-136955193723518/AnsiballZ_stat.py'
Dec 08 19:56:50 compute-0 sudo[68607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:50 compute-0 python3.9[68609]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:56:50 compute-0 sudo[68607]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:51 compute-0 sudo[68730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzhcdpwlfrtkpqnhiyltryuynmtbqumo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223810.3015904-175-136955193723518/AnsiballZ_copy.py'
Dec 08 19:56:51 compute-0 sudo[68730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:51 compute-0 python3.9[68732]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223810.3015904-175-136955193723518/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:56:51 compute-0 sudo[68730]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:51 compute-0 sudo[68882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfgbywbbnhzmgtzqaqdqlzhybjynlyga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223811.5326116-190-44685109371754/AnsiballZ_systemd.py'
Dec 08 19:56:51 compute-0 sudo[68882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:52 compute-0 python3.9[68884]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:56:52 compute-0 systemd[1]: Reloading.
Dec 08 19:56:52 compute-0 systemd-rc-local-generator[68912]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:56:52 compute-0 systemd-sysv-generator[68915]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:56:52 compute-0 systemd[1]: Reloading.
Dec 08 19:56:52 compute-0 systemd-rc-local-generator[68947]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:56:52 compute-0 systemd-sysv-generator[68952]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:56:52 compute-0 systemd[1]: Starting Create netns directory...
Dec 08 19:56:52 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 08 19:56:52 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 08 19:56:52 compute-0 systemd[1]: Finished Create netns directory.
Dec 08 19:56:52 compute-0 sudo[68882]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:53 compute-0 python3.9[69110]: ansible-ansible.builtin.service_facts Invoked
Dec 08 19:56:53 compute-0 network[69127]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 08 19:56:53 compute-0 network[69128]: 'network-scripts' will be removed from distribution in near future.
Dec 08 19:56:53 compute-0 network[69129]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 08 19:56:57 compute-0 sudo[69389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wgcautftcpgmrrvkeswfgungobltyedy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223817.328583-206-123347788815204/AnsiballZ_systemd.py'
Dec 08 19:56:57 compute-0 sudo[69389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:57 compute-0 python3.9[69391]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:56:57 compute-0 systemd[1]: Reloading.
Dec 08 19:56:58 compute-0 systemd-rc-local-generator[69418]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:56:58 compute-0 systemd-sysv-generator[69422]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:56:58 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 08 19:56:58 compute-0 iptables.init[69431]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 08 19:56:58 compute-0 iptables.init[69431]: iptables: Flushing firewall rules: [  OK  ]
Dec 08 19:56:58 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Dec 08 19:56:58 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 08 19:56:58 compute-0 sudo[69389]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:58 compute-0 sudo[69625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmhwrtvxgymbjuaqnbakzohusdiypfci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223818.6549993-206-233919392779353/AnsiballZ_systemd.py'
Dec 08 19:56:58 compute-0 sudo[69625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:56:59 compute-0 python3.9[69627]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:56:59 compute-0 sudo[69625]: pam_unix(sudo:session): session closed for user root
Dec 08 19:56:59 compute-0 sudo[69779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfkjgmmsiuagswstxmzyysbfefvbjebh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223819.5535228-222-45330061872963/AnsiballZ_systemd.py'
Dec 08 19:56:59 compute-0 sudo[69779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:00 compute-0 python3.9[69781]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:57:00 compute-0 systemd[1]: Reloading.
Dec 08 19:57:00 compute-0 systemd-sysv-generator[69814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:57:00 compute-0 systemd-rc-local-generator[69811]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:57:00 compute-0 systemd[1]: Starting Netfilter Tables...
Dec 08 19:57:00 compute-0 systemd[1]: Finished Netfilter Tables.
Dec 08 19:57:00 compute-0 sudo[69779]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:01 compute-0 sudo[69971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdbmyvlfmkkmeycgamtaikzdcejgfmql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223820.7938373-230-271452600905140/AnsiballZ_command.py'
Dec 08 19:57:01 compute-0 sudo[69971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:01 compute-0 python3.9[69973]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:57:01 compute-0 sudo[69971]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:02 compute-0 sudo[70124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrpkfgtkjhbehhwudrqjaujwadkmyjle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223821.8725042-244-151780161106661/AnsiballZ_stat.py'
Dec 08 19:57:02 compute-0 sudo[70124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:02 compute-0 python3.9[70126]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:02 compute-0 sudo[70124]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:02 compute-0 sudo[70249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrhlccnmxrzwdvottxezswmxhtdnalfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223821.8725042-244-151780161106661/AnsiballZ_copy.py'
Dec 08 19:57:02 compute-0 sudo[70249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:02 compute-0 python3.9[70251]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223821.8725042-244-151780161106661/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:02 compute-0 sudo[70249]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:03 compute-0 sudo[70402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwqbwkicxpjlcomwthvflnweduayupyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223823.120107-259-134596908511975/AnsiballZ_systemd.py'
Dec 08 19:57:03 compute-0 sudo[70402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:03 compute-0 python3.9[70404]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 19:57:03 compute-0 systemd[1]: Reloading OpenSSH server daemon...
Dec 08 19:57:03 compute-0 systemd[1]: Reloaded OpenSSH server daemon.
Dec 08 19:57:03 compute-0 sshd[1006]: Received SIGHUP; restarting.
Dec 08 19:57:03 compute-0 sshd[1006]: Server listening on 0.0.0.0 port 22.
Dec 08 19:57:03 compute-0 sshd[1006]: Server listening on :: port 22.
Dec 08 19:57:03 compute-0 sudo[70402]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:04 compute-0 sudo[70558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feqzdlefckcnxvvildqdogzkfferbahb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223823.9746387-267-273223962254740/AnsiballZ_file.py'
Dec 08 19:57:04 compute-0 sudo[70558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:04 compute-0 python3.9[70560]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:04 compute-0 sudo[70558]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:04 compute-0 sudo[70710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pkqvioaitcumqqmxystvyavxigrmyqgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223824.6780682-275-86660800341324/AnsiballZ_stat.py'
Dec 08 19:57:04 compute-0 sudo[70710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:05 compute-0 python3.9[70712]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:05 compute-0 sudo[70710]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:05 compute-0 sudo[70833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lylrmdstzmxfhmtzobngmddhxlogwtkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223824.6780682-275-86660800341324/AnsiballZ_copy.py'
Dec 08 19:57:05 compute-0 sudo[70833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:05 compute-0 python3.9[70835]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223824.6780682-275-86660800341324/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:05 compute-0 sudo[70833]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:06 compute-0 sudo[70985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdtjwpglzacgxumnaglpsmscpiownbze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223826.0043638-293-229351715792150/AnsiballZ_timezone.py'
Dec 08 19:57:06 compute-0 sudo[70985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:06 compute-0 python3.9[70987]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 08 19:57:06 compute-0 systemd[1]: Starting Time & Date Service...
Dec 08 19:57:06 compute-0 systemd[1]: Started Time & Date Service.
Dec 08 19:57:06 compute-0 sudo[70985]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:07 compute-0 sudo[71141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsezgierjpcesqcubncumjvidggxario ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223827.0357678-302-52190942033231/AnsiballZ_file.py'
Dec 08 19:57:07 compute-0 sudo[71141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:07 compute-0 python3.9[71143]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:07 compute-0 sudo[71141]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:07 compute-0 sudo[71293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kjzltrohxtswqsimfktvgbqloojbvdpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223827.6437979-310-226099790084621/AnsiballZ_stat.py'
Dec 08 19:57:07 compute-0 sudo[71293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:08 compute-0 python3.9[71295]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:08 compute-0 sudo[71293]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:08 compute-0 sudo[71416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usbjwimcrxlcssoeboayubrczzubtkml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223827.6437979-310-226099790084621/AnsiballZ_copy.py'
Dec 08 19:57:08 compute-0 sudo[71416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:08 compute-0 python3.9[71418]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223827.6437979-310-226099790084621/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:08 compute-0 sudo[71416]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:09 compute-0 sudo[71568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooigbtjkuvigqodmsqgrdydnmvksyjxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223828.8476963-325-278698870159007/AnsiballZ_stat.py'
Dec 08 19:57:09 compute-0 sudo[71568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:09 compute-0 python3.9[71570]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:09 compute-0 sudo[71568]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:09 compute-0 sudo[71691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzlwnwagrsxrtculbjnfnpqnunjyiciv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223828.8476963-325-278698870159007/AnsiballZ_copy.py'
Dec 08 19:57:09 compute-0 sudo[71691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:09 compute-0 python3.9[71693]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223828.8476963-325-278698870159007/.source.yaml _original_basename=.hjqzdnrx follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:09 compute-0 sudo[71691]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:10 compute-0 sudo[71843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tamafcmgmsrfhkokoflpndblvcqrbnzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223830.1148326-340-107953709863934/AnsiballZ_stat.py'
Dec 08 19:57:10 compute-0 sudo[71843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:10 compute-0 python3.9[71845]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:10 compute-0 sudo[71843]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:10 compute-0 sudo[71966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plresyzjyizqhpavfpnmsbqrgvyoqgzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223830.1148326-340-107953709863934/AnsiballZ_copy.py'
Dec 08 19:57:10 compute-0 sudo[71966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:11 compute-0 python3.9[71968]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223830.1148326-340-107953709863934/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:11 compute-0 sudo[71966]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:11 compute-0 sudo[72118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqhzqcfwmgerdxxhffmifpjvfitnijgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223831.3955202-355-107584242450498/AnsiballZ_command.py'
Dec 08 19:57:11 compute-0 sudo[72118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:11 compute-0 python3.9[72120]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:57:11 compute-0 sudo[72118]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:12 compute-0 sudo[72271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srloulgyczyttijnjuvixtjasjunpnon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223832.069094-363-105260479226624/AnsiballZ_command.py'
Dec 08 19:57:12 compute-0 sudo[72271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:12 compute-0 python3.9[72273]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:57:12 compute-0 sudo[72271]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:13 compute-0 sudo[72424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnqlsyhanopxzbxbcrcornqtvktljhga ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765223832.7932868-371-228959842868685/AnsiballZ_edpm_nftables_from_files.py'
Dec 08 19:57:13 compute-0 sudo[72424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:13 compute-0 python3[72426]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 08 19:57:13 compute-0 sudo[72424]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:14 compute-0 sudo[72576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryoetauulzxijionusynijjlxwafxhxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223833.6892333-379-151524507774051/AnsiballZ_stat.py'
Dec 08 19:57:14 compute-0 sudo[72576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:14 compute-0 python3.9[72578]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:14 compute-0 sudo[72576]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:14 compute-0 sudo[72699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arghjmhcssiamxgddsrqcwgztwbzcoii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223833.6892333-379-151524507774051/AnsiballZ_copy.py'
Dec 08 19:57:14 compute-0 sudo[72699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:15 compute-0 python3.9[72701]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223833.6892333-379-151524507774051/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:15 compute-0 sudo[72699]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:15 compute-0 sudo[72851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrfgoxznhrxergaxqfuymufxlfzkkocr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223835.2734597-394-116155298934205/AnsiballZ_stat.py'
Dec 08 19:57:15 compute-0 sudo[72851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:15 compute-0 python3.9[72853]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:15 compute-0 sudo[72851]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:16 compute-0 sudo[72974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blajajlfxnjmjelpsbetkpmyopufltgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223835.2734597-394-116155298934205/AnsiballZ_copy.py'
Dec 08 19:57:16 compute-0 sudo[72974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:16 compute-0 python3.9[72976]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223835.2734597-394-116155298934205/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:16 compute-0 sudo[72974]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:16 compute-0 sudo[73126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oojbjvfhvfuurpipbjyhsvctyepxxwdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223836.53255-409-98979478768124/AnsiballZ_stat.py'
Dec 08 19:57:16 compute-0 sudo[73126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:16 compute-0 python3.9[73128]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:17 compute-0 sudo[73126]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:17 compute-0 sudo[73249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yziqlnrhqiuspzizwnglhycgoqbeehhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223836.53255-409-98979478768124/AnsiballZ_copy.py'
Dec 08 19:57:17 compute-0 sudo[73249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:17 compute-0 python3.9[73251]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223836.53255-409-98979478768124/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:17 compute-0 sudo[73249]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:18 compute-0 sudo[73401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywhrtlyprvmjqfndvxrduvnzpxkaqagt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223837.8329937-424-215138138668614/AnsiballZ_stat.py'
Dec 08 19:57:18 compute-0 sudo[73401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:18 compute-0 python3.9[73403]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:18 compute-0 sudo[73401]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:18 compute-0 sudo[73524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzdmjypuhsdzfgyijmuawafojuyvikfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223837.8329937-424-215138138668614/AnsiballZ_copy.py'
Dec 08 19:57:18 compute-0 sudo[73524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:18 compute-0 python3.9[73526]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223837.8329937-424-215138138668614/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:18 compute-0 sudo[73524]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:19 compute-0 sudo[73676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brltdelohlbdqgiapkhcecqlsrorwzwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223839.0623295-439-162932178319891/AnsiballZ_stat.py'
Dec 08 19:57:19 compute-0 sudo[73676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:19 compute-0 python3.9[73678]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:57:19 compute-0 sudo[73676]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:19 compute-0 sudo[73801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcwabdybbgqyerbltdjbflkeuxgfidfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223839.0623295-439-162932178319891/AnsiballZ_copy.py'
Dec 08 19:57:19 compute-0 sudo[73801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:20 compute-0 python3.9[73803]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223839.0623295-439-162932178319891/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:20 compute-0 sudo[73801]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:20 compute-0 sshd-session[73706]: Received disconnect from 159.223.8.81 port 36120:11: Bye Bye [preauth]
Dec 08 19:57:20 compute-0 sshd-session[73706]: Disconnected from authenticating user root 159.223.8.81 port 36120 [preauth]
Dec 08 19:57:20 compute-0 sudo[73953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oouwsriwkzrncrlbnndszvuikdvyqxeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223840.3469844-454-274112056354437/AnsiballZ_file.py'
Dec 08 19:57:20 compute-0 sudo[73953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:20 compute-0 python3.9[73955]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:20 compute-0 sudo[73953]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:21 compute-0 sudo[74105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkmimwkjcdiedblbpmpxlukjyvrolhxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223841.001709-462-275553160393271/AnsiballZ_command.py'
Dec 08 19:57:21 compute-0 sudo[74105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:21 compute-0 python3.9[74107]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:57:21 compute-0 sudo[74105]: pam_unix(sudo:session): session closed for user root
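(Annotation: the check the play just ran can be reproduced by hand; this is a minimal sketch of the same /bin/sh pipeline recorded in the command entry above, assuming the five edpm-*.nft fragments written by the earlier copy tasks are still in place. nft -c -f - parses and validates the concatenated ruleset from stdin without loading it.)

    set -o pipefail
    cat /etc/nftables/edpm-chains.nft \
        /etc/nftables/edpm-flushes.nft \
        /etc/nftables/edpm-rules.nft \
        /etc/nftables/edpm-update-jumps.nft \
        /etc/nftables/edpm-jumps.nft | nft -c -f -    # check only, does not touch the live ruleset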
Dec 08 19:57:22 compute-0 sudo[74264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eicvoocmoyjqagtusfnkjmsjibrzveui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223841.9205506-470-90196775673324/AnsiballZ_blockinfile.py'
Dec 08 19:57:22 compute-0 sudo[74264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:22 compute-0 python3.9[74266]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:22 compute-0 sudo[74264]: pam_unix(sudo:session): session closed for user root
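(Annotation: reconstructed from the blockinfile parameters logged above (marker "# {mark} ANSIBLE MANAGED BLOCK", marker_begin=BEGIN, marker_end=END), the managed block in /etc/sysconfig/nftables.conf should read roughly as shown in the comments below; the rest of that file is not visible in the log. The final command is the module's own validate= step.)

    # Expected managed block:
    #   # BEGIN ANSIBLE MANAGED BLOCK
    #   include "/etc/nftables/iptables.nft"
    #   include "/etc/nftables/edpm-chains.nft"
    #   include "/etc/nftables/edpm-rules.nft"
    #   include "/etc/nftables/edpm-jumps.nft"
    #   # END ANSIBLE MANAGED BLOCK
    nft -c -f /etc/sysconfig/nftables.conf    # validate='nft -c -f %s' as invoked by the module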
Dec 08 19:57:23 compute-0 sudo[74417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oexyxqkntyfbquflukymdovrbovmtfzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223842.9142544-479-171221165047482/AnsiballZ_file.py'
Dec 08 19:57:23 compute-0 sudo[74417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:23 compute-0 python3.9[74419]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:23 compute-0 sudo[74417]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:23 compute-0 sudo[74569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-musmvhblticknfjqkhjfgumfsmmadjai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223843.5894275-479-75315011382730/AnsiballZ_file.py'
Dec 08 19:57:23 compute-0 sudo[74569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:24 compute-0 python3.9[74571]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:24 compute-0 sudo[74569]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:25 compute-0 sudo[74721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxdbjbebhybbuuraxmwkjxqwhuexdrhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223844.22306-494-182472046423835/AnsiballZ_mount.py'
Dec 08 19:57:25 compute-0 sudo[74721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:25 compute-0 python3.9[74723]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 08 19:57:25 compute-0 sudo[74721]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:25 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 08 19:57:25 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 08 19:57:25 compute-0 sudo[74875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkbaaiynplmrzgjbqlpbjigorqsmsxby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223845.4171605-494-100049584478106/AnsiballZ_mount.py'
Dec 08 19:57:25 compute-0 sudo[74875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:25 compute-0 python3.9[74877]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 08 19:57:25 compute-0 sudo[74875]: pam_unix(sudo:session): session closed for user root
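(Annotation: the two ansible.posix.mount calls above mount and persist hugetlbfs at the directories created a few tasks earlier; a hand-run equivalent, with the fstab lines the module would write given src=none, dump=0, passno=0, is sketched below.)

    # /etc/fstab entries (src path fstype opts dump passno):
    #   none /dev/hugepages1G hugetlbfs pagesize=1G 0 0
    #   none /dev/hugepages2M hugetlbfs pagesize=2M 0 0
    mount -t hugetlbfs -o pagesize=1G none /dev/hugepages1G
    mount -t hugetlbfs -o pagesize=2M none /dev/hugepages2M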
Dec 08 19:57:26 compute-0 sshd-session[65712]: Connection closed by 192.168.122.30 port 36716
Dec 08 19:57:26 compute-0 sshd-session[65709]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:57:26 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Dec 08 19:57:26 compute-0 systemd[1]: session-14.scope: Consumed 36.491s CPU time.
Dec 08 19:57:26 compute-0 systemd-logind[793]: Session 14 logged out. Waiting for processes to exit.
Dec 08 19:57:26 compute-0 systemd-logind[793]: Removed session 14.
Dec 08 19:57:31 compute-0 sshd-session[74903]: Accepted publickey for zuul from 192.168.122.30 port 39846 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:57:31 compute-0 systemd-logind[793]: New session 15 of user zuul.
Dec 08 19:57:31 compute-0 systemd[1]: Started Session 15 of User zuul.
Dec 08 19:57:31 compute-0 sshd-session[74903]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:57:32 compute-0 sudo[75056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhmpgmmmhnpdaljvapsxaftrkuikahhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223851.7127225-16-237890996321740/AnsiballZ_tempfile.py'
Dec 08 19:57:32 compute-0 sudo[75056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:32 compute-0 python3.9[75058]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 08 19:57:32 compute-0 sudo[75056]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:33 compute-0 sudo[75208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haoixgpcablohqzzrmdfujgxmjwdvcsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223852.754344-28-117139016285713/AnsiballZ_stat.py'
Dec 08 19:57:33 compute-0 sudo[75208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:33 compute-0 python3.9[75210]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:57:33 compute-0 sudo[75208]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:34 compute-0 sudo[75360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-millpjjiymdbmthxzexrgccbemxoocir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223853.6402543-38-101462944283312/AnsiballZ_setup.py'
Dec 08 19:57:34 compute-0 sudo[75360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:34 compute-0 python3.9[75362]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:57:34 compute-0 sudo[75360]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:35 compute-0 sudo[75512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpwvniguyafpggjewfthvznmocqlmjcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223854.743834-47-27246145308605/AnsiballZ_blockinfile.py'
Dec 08 19:57:35 compute-0 sudo[75512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:35 compute-0 python3.9[75514]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC2oKcCSyijmTj0ezHUO4NUiOL9A3EnDFOUniYUXv5YpvdA20u+zCLEZbf470aAUzejqq7tY8JSvLjFthADKt6UVUhCYg8lzB86nFqXzzPO8rh+ewftDIyoC2dp6aIQTy3YYYZLN7PZYctwwGHO0JCe1r1GmhraRJPUmpOUibwzUuGxC/jMxqWbKZVHnm+uJEU3Vzzp19/aLbdceuMYUFcFsAC+Qv6uVKiX/5gjzqcWkxIE9Jn4Ih5t0RllfFjHGN4ecB0pb8Q7pVpfaVlIi6hcuZ2Dmhui4fl14w5Iv2sr54aIyj89Pu4jUHIMXWoUupBK7x/0vOEqRByvMAVeaehBUUV1qQZBfA7xjQ0I+sRIhRIp0Je0m1YuU6crhf0y9vGgUZAThp5VvaR5JKuQAFq/dAJVp5gR1xpZ2WhZotjS2RsyDVC/G9ya8HTS7ug0GoO7FmC+GhILm49wLYetMu0LWEsNJG+xLWuc1x6jiQkdwNllWcDxY5AioE2PdTfg1D8=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILxenUrPbvmxOIwGvEHZPkwKlr8LpwgX2odyK4rBup3Z
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDyrGg/mBXJgd5sOYWKfDLrvDdxodvYEzH77LWwZ6ByrjxrbUkExBG+K4ZRVl6tRcBdyum4aARNgOqVFVqGuHIs=
                                             create=True mode=0644 path=/tmp/ansible.ow01gria state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:35 compute-0 sudo[75512]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:36 compute-0 sudo[75664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voeswlmjdwbxoxuedyarpdyimpiwaiqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223855.529893-55-179377807842535/AnsiballZ_command.py'
Dec 08 19:57:36 compute-0 sudo[75664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:36 compute-0 python3.9[75666]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ow01gria' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:57:36 compute-0 sudo[75664]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:36 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 08 19:57:36 compute-0 sudo[75820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-niwrqolhxzbytegwjskpsumabfdaodzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223856.3918626-63-126557522017015/AnsiballZ_file.py'
Dec 08 19:57:36 compute-0 sudo[75820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:37 compute-0 python3.9[75822]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ow01gria state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:37 compute-0 sudo[75820]: pam_unix(sudo:session): session closed for user root
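(Annotation: the four tasks above assemble /etc/ssh/ssh_known_hosts through a root-owned temp file; the same sequence by hand would look like the sketch below. The host keys are elided here; the full values appear in the blockinfile entry above.)

    tmp=$(mktemp /tmp/ansible.XXXXXXXX)                # ansible.builtin.tempfile
    {                                                  # ansible.builtin.blockinfile (keys elided)
        echo '# BEGIN ANSIBLE MANAGED BLOCK'
        echo 'compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAA...'
        echo 'compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAA...'
        echo 'compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAA...'
        echo '# END ANSIBLE MANAGED BLOCK'
    } > "$tmp"
    cat "$tmp" > /etc/ssh/ssh_known_hosts              # ansible.legacy.command
    rm -f "$tmp"                                       # ansible.builtin.file state=absent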
Dec 08 19:57:37 compute-0 sshd-session[74906]: Connection closed by 192.168.122.30 port 39846
Dec 08 19:57:37 compute-0 sshd-session[74903]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:57:37 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Dec 08 19:57:37 compute-0 systemd[1]: session-15.scope: Consumed 3.417s CPU time.
Dec 08 19:57:37 compute-0 systemd-logind[793]: Session 15 logged out. Waiting for processes to exit.
Dec 08 19:57:37 compute-0 systemd-logind[793]: Removed session 15.
Dec 08 19:57:43 compute-0 sshd-session[75849]: Accepted publickey for zuul from 192.168.122.30 port 37856 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:57:43 compute-0 systemd-logind[793]: New session 16 of user zuul.
Dec 08 19:57:43 compute-0 systemd[1]: Started Session 16 of User zuul.
Dec 08 19:57:43 compute-0 sshd-session[75849]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:57:44 compute-0 python3.9[76002]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:57:45 compute-0 sudo[76156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbqfsqsaoxmsaveccwsabddtyaapuerg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223864.5429835-32-105007792398160/AnsiballZ_systemd.py'
Dec 08 19:57:45 compute-0 sudo[76156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:45 compute-0 python3.9[76158]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 08 19:57:45 compute-0 sudo[76156]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:46 compute-0 sudo[76310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaiaccnzajvosoyacqdavxuunoaximaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223865.7622247-40-125326785482971/AnsiballZ_systemd.py'
Dec 08 19:57:46 compute-0 sudo[76310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:46 compute-0 python3.9[76312]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 19:57:46 compute-0 sudo[76310]: pam_unix(sudo:session): session closed for user root
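(Annotation: the two ansible.builtin.systemd calls above are applied as separate tasks, enable then start; by hand they amount to the following.)

    systemctl enable sshd     # ansible.builtin.systemd enabled=True
    systemctl start sshd      # ansible.builtin.systemd state=started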
Dec 08 19:57:47 compute-0 sudo[76463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyjvtqhcepgfadytrljwamckamzxtbrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223866.5831118-49-138798830373885/AnsiballZ_command.py'
Dec 08 19:57:47 compute-0 sudo[76463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:47 compute-0 python3.9[76465]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:57:47 compute-0 sudo[76463]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:47 compute-0 sudo[76616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qiegwwitvvjllrosypenysscakinjwbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223867.4809372-57-147837662465236/AnsiballZ_stat.py'
Dec 08 19:57:47 compute-0 sudo[76616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:48 compute-0 sshd-session[75847]: Received disconnect from 45.78.228.32 port 39728:11: Bye Bye [preauth]
Dec 08 19:57:48 compute-0 sshd-session[75847]: Disconnected from authenticating user root 45.78.228.32 port 39728 [preauth]
Dec 08 19:57:48 compute-0 python3.9[76618]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:57:48 compute-0 sudo[76616]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:48 compute-0 sudo[76770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amnjdrxfmiopwrxdfnrlmeqpphychzpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223868.3262854-65-177304066375289/AnsiballZ_command.py'
Dec 08 19:57:48 compute-0 sudo[76770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:48 compute-0 python3.9[76772]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:57:48 compute-0 sudo[76770]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:49 compute-0 sudo[76925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anfhkgbahvxmxusibysogzdhyjcecysj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223868.9701097-73-8050705226086/AnsiballZ_file.py'
Dec 08 19:57:49 compute-0 sudo[76925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:49 compute-0 python3.9[76927]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:57:49 compute-0 sudo[76925]: pam_unix(sudo:session): session closed for user root
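(Annotation: condensing the four tasks above into one sketch: the chains file is loaded unconditionally, then the flush/rules/update-jump fragments are applied and the .changed marker removed. Guarding that second step on the marker the stat task checks is an inference from the task ordering, not something the log states directly.)

    nft -f /etc/nftables/edpm-chains.nft
    if [ -e /etc/nftables/edpm-rules.nft.changed ]; then
        set -o pipefail
        cat /etc/nftables/edpm-flushes.nft \
            /etc/nftables/edpm-rules.nft \
            /etc/nftables/edpm-update-jumps.nft | nft -f -
        rm -f /etc/nftables/edpm-rules.nft.changed
    fi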
Dec 08 19:57:50 compute-0 sshd-session[75852]: Connection closed by 192.168.122.30 port 37856
Dec 08 19:57:50 compute-0 sshd-session[75849]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:57:50 compute-0 systemd-logind[793]: Session 16 logged out. Waiting for processes to exit.
Dec 08 19:57:50 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Dec 08 19:57:50 compute-0 systemd[1]: session-16.scope: Consumed 4.570s CPU time.
Dec 08 19:57:50 compute-0 systemd-logind[793]: Removed session 16.
Dec 08 19:57:55 compute-0 sshd-session[76952]: Accepted publickey for zuul from 192.168.122.30 port 35304 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:57:55 compute-0 systemd-logind[793]: New session 17 of user zuul.
Dec 08 19:57:55 compute-0 systemd[1]: Started Session 17 of User zuul.
Dec 08 19:57:55 compute-0 sshd-session[76952]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:57:56 compute-0 python3.9[77105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:57:57 compute-0 sudo[77259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arazjgttfikpkudfjnufxyffxcjwnedc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223877.3638184-34-65140372860751/AnsiballZ_setup.py'
Dec 08 19:57:57 compute-0 sudo[77259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:58 compute-0 python3.9[77261]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:57:58 compute-0 sudo[77259]: pam_unix(sudo:session): session closed for user root
Dec 08 19:57:58 compute-0 sudo[77343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-othkwfvuxchkmnlhxzbpyekxddhxssbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223877.3638184-34-65140372860751/AnsiballZ_dnf.py'
Dec 08 19:57:58 compute-0 sudo[77343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:57:58 compute-0 python3.9[77345]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 08 19:58:00 compute-0 sshd-session[77347]: Received disconnect from 172.190.42.55 port 42134:11: Bye Bye [preauth]
Dec 08 19:58:00 compute-0 sshd-session[77347]: Disconnected from authenticating user root 172.190.42.55 port 42134 [preauth]
Dec 08 19:58:00 compute-0 sudo[77343]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:00 compute-0 python3.9[77498]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:58:02 compute-0 python3.9[77649]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
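(Annotation: the two probes above check whether the node needs a reboot; run by hand they would be roughly the following. needs-restarting -r exits non-zero when a reboot is recommended and zero otherwise.)

    needs-restarting -r && echo 'no reboot required' || echo 'reboot required'
    find /var/lib/openstack/reboot_required/ -maxdepth 1 -type f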
Dec 08 19:58:03 compute-0 python3.9[77799]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:58:03 compute-0 python3.9[77949]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:58:04 compute-0 sshd-session[76955]: Connection closed by 192.168.122.30 port 35304
Dec 08 19:58:04 compute-0 sshd-session[76952]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:58:04 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Dec 08 19:58:04 compute-0 systemd[1]: session-17.scope: Consumed 6.042s CPU time.
Dec 08 19:58:04 compute-0 systemd-logind[793]: Session 17 logged out. Waiting for processes to exit.
Dec 08 19:58:04 compute-0 systemd-logind[793]: Removed session 17.
Dec 08 19:58:10 compute-0 sshd-session[77974]: Accepted publickey for zuul from 192.168.122.30 port 55598 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:58:10 compute-0 systemd-logind[793]: New session 18 of user zuul.
Dec 08 19:58:10 compute-0 systemd[1]: Started Session 18 of User zuul.
Dec 08 19:58:10 compute-0 sshd-session[77974]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:58:11 compute-0 python3.9[78127]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:58:12 compute-0 sudo[78283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbextfhagurcsgryyodkbnlnkeuertkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223892.4827752-50-236428214117687/AnsiballZ_file.py'
Dec 08 19:58:12 compute-0 sudo[78283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:13 compute-0 python3.9[78285]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:13 compute-0 sudo[78283]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:13 compute-0 sudo[78435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atoycyngxxgmloqbrvemimjnhofdyapz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223893.5194361-50-215953978138192/AnsiballZ_file.py'
Dec 08 19:58:13 compute-0 sudo[78435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:14 compute-0 python3.9[78437]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:14 compute-0 sudo[78435]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:14 compute-0 sudo[78587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mclpapsmlrwzmcdbmqsqirccelfppygo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223894.2807674-65-16672761067961/AnsiballZ_stat.py'
Dec 08 19:58:14 compute-0 sudo[78587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:14 compute-0 python3.9[78589]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:14 compute-0 sudo[78587]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:15 compute-0 sudo[78710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juxztzkotmuqhaqgifjrhnoxvvsagbug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223894.2807674-65-16672761067961/AnsiballZ_copy.py'
Dec 08 19:58:15 compute-0 sudo[78710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:15 compute-0 python3.9[78712]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223894.2807674-65-16672761067961/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=00a4dab14730c784aff2c174ed67246673972927 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:15 compute-0 sudo[78710]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:16 compute-0 sudo[78862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozhsuetwcysnguilqnobssoufnyteauq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223895.9807196-65-178030292418191/AnsiballZ_stat.py'
Dec 08 19:58:16 compute-0 sudo[78862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:16 compute-0 python3.9[78864]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:16 compute-0 sudo[78862]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:16 compute-0 sudo[78985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nosrvzqrizlzimonwkziudsujpprznui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223895.9807196-65-178030292418191/AnsiballZ_copy.py'
Dec 08 19:58:16 compute-0 sudo[78985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:17 compute-0 python3.9[78987]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223895.9807196-65-178030292418191/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=e626ede3b8be210a8d3eaafd0bcb26235dcd41b1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:17 compute-0 sudo[78985]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:17 compute-0 sudo[79137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuvpadwxsxfzvccbagopddaibwjvymyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223897.226386-65-196354929838190/AnsiballZ_stat.py'
Dec 08 19:58:17 compute-0 sudo[79137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:17 compute-0 python3.9[79139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:17 compute-0 sudo[79137]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:18 compute-0 sudo[79260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbkgganuzpdfulddxdasyhnneaxuidbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223897.226386-65-196354929838190/AnsiballZ_copy.py'
Dec 08 19:58:18 compute-0 sudo[79260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:18 compute-0 python3.9[79262]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223897.226386-65-196354929838190/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=e05820b3c7ba86674bbd4a4cb71ca0345bc7418a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:18 compute-0 sudo[79260]: pam_unix(sudo:session): session closed for user root
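(Annotation: tls.crt, ca.crt and tls.key for libvirt are now in place with mode 0600; a quick way to sanity-check them, not part of the play, would be the following.)

    d=/var/lib/openstack/certs/libvirt/default
    ls -lZ "$d"                                            # expect 0600 root-owned files
    openssl x509 -in "$d/tls.crt" -noout -subject -enddate
    openssl verify -CAfile "$d/ca.crt" "$d/tls.crt"
    openssl pkey -in "$d/tls.key" -noout                   # key parses without error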
Dec 08 19:58:18 compute-0 sudo[79412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crnvwgjmdaeuiepjublrrluhnsbclcvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223898.5352328-109-80133943385698/AnsiballZ_file.py'
Dec 08 19:58:18 compute-0 sudo[79412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:19 compute-0 python3.9[79414]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:19 compute-0 sudo[79412]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:19 compute-0 sudo[79564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezdeowmvqpmgszmldchffmiwkouknwhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223899.2575023-109-85832648584438/AnsiballZ_file.py'
Dec 08 19:58:19 compute-0 sudo[79564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:19 compute-0 python3.9[79566]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:19 compute-0 sudo[79564]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:20 compute-0 sudo[79718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilusgzatcqaglxgpsvotmhahdwpqeaml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223899.9735603-124-150689094993764/AnsiballZ_stat.py'
Dec 08 19:58:20 compute-0 sudo[79718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:20 compute-0 sshd-session[79570]: Received disconnect from 159.223.8.81 port 48394:11: Bye Bye [preauth]
Dec 08 19:58:20 compute-0 sshd-session[79570]: Disconnected from authenticating user root 159.223.8.81 port 48394 [preauth]
Dec 08 19:58:20 compute-0 python3.9[79720]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:20 compute-0 sudo[79718]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:20 compute-0 sudo[79841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwtanrmekstqpuvusqrmvnddjkqatxoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223899.9735603-124-150689094993764/AnsiballZ_copy.py'
Dec 08 19:58:20 compute-0 sudo[79841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:21 compute-0 python3.9[79843]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223899.9735603-124-150689094993764/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=be80bf50add81e0948017eef4d780b57303fdca3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:21 compute-0 sudo[79841]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:21 compute-0 sudo[79993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcaeexkxxdlpfggjhygxdyhvsuzshxvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223901.271685-124-50744757546756/AnsiballZ_stat.py'
Dec 08 19:58:21 compute-0 sudo[79993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:21 compute-0 python3.9[79995]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:21 compute-0 sudo[79993]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:22 compute-0 sudo[80116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjluruyzplneppvqhaenciwtgmdvfrsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223901.271685-124-50744757546756/AnsiballZ_copy.py'
Dec 08 19:58:22 compute-0 sudo[80116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:22 compute-0 python3.9[80118]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223901.271685-124-50744757546756/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=7f59be47b3e23e5a90fc1c928cb31b5f50e06728 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:22 compute-0 sudo[80116]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:22 compute-0 sudo[80268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isjyjlejkfwmfoxjhglsktqmzqkoklgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223902.5960267-124-99976403008589/AnsiballZ_stat.py'
Dec 08 19:58:22 compute-0 sudo[80268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:23 compute-0 python3.9[80270]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:23 compute-0 sudo[80268]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:23 compute-0 sudo[80391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eeuheticbaqacxwyjbgqlndsydgnphxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223902.5960267-124-99976403008589/AnsiballZ_copy.py'
Dec 08 19:58:23 compute-0 sudo[80391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:23 compute-0 python3.9[80393]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223902.5960267-124-99976403008589/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=a5a5ea0daae8a819695e8f16818c59c4752e62a4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:23 compute-0 sudo[80391]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:24 compute-0 sudo[80543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvziyotmwedgydqfxydprnnxtiepirbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223903.8600738-168-141895104704408/AnsiballZ_file.py'
Dec 08 19:58:24 compute-0 sudo[80543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:24 compute-0 python3.9[80545]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:24 compute-0 sudo[80543]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:24 compute-0 sudo[80695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ismfujmfugmdahettwkvcukpedjqrgfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223904.6176493-168-30807693911679/AnsiballZ_file.py'
Dec 08 19:58:24 compute-0 sudo[80695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:25 compute-0 python3.9[80697]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:25 compute-0 sudo[80695]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:25 compute-0 sudo[80847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmbfsfyzghykdrrmbnertwwnppqxcjab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223905.3682048-183-57042999046536/AnsiballZ_stat.py'
Dec 08 19:58:25 compute-0 sudo[80847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:25 compute-0 python3.9[80849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:25 compute-0 sudo[80847]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:26 compute-0 sudo[80970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duukbwjaylmhsmyclfqmwjczzwapywoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223905.3682048-183-57042999046536/AnsiballZ_copy.py'
Dec 08 19:58:26 compute-0 sudo[80970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:26 compute-0 python3.9[80972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223905.3682048-183-57042999046536/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=2414ad19b4e497193f2882544d2f6f18cfc211b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:26 compute-0 sudo[80970]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:26 compute-0 sudo[81122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvafgtgdpkpqcfctimrojhyzfyotdkam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223906.5855129-183-168927561369593/AnsiballZ_stat.py'
Dec 08 19:58:26 compute-0 sudo[81122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:27 compute-0 python3.9[81124]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:27 compute-0 sudo[81122]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:27 compute-0 sudo[81245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tfqhwidwyvgelqnbjvwkjdctrccmiiwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223906.5855129-183-168927561369593/AnsiballZ_copy.py'
Dec 08 19:58:27 compute-0 sudo[81245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:27 compute-0 python3.9[81247]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223906.5855129-183-168927561369593/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f527134524422da855356b0b79eac832f1319e86 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:27 compute-0 sudo[81245]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:28 compute-0 sudo[81397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upwkinrxrckvmpacsrieqqkwaeqhlnmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223907.8160336-183-254412273272355/AnsiballZ_stat.py'
Dec 08 19:58:28 compute-0 sudo[81397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:28 compute-0 python3.9[81399]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:28 compute-0 sudo[81397]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:28 compute-0 sudo[81520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpxjwtcqjehygfbjjusiaybqtvjzbogy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223907.8160336-183-254412273272355/AnsiballZ_copy.py'
Dec 08 19:58:28 compute-0 sudo[81520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:28 compute-0 python3.9[81522]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223907.8160336-183-254412273272355/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=7e2b0c073f78f043139c269a89488ddf47f99989 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:28 compute-0 sudo[81520]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:29 compute-0 sudo[81672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivebjtefcsvtskoinhobffpgcfcqikdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223909.112127-227-122593698398090/AnsiballZ_file.py'
Dec 08 19:58:29 compute-0 sudo[81672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:29 compute-0 python3.9[81674]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:29 compute-0 sudo[81672]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:30 compute-0 sudo[81824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcjdbjjcnkxvwvtzmcuokbnumqhfnawx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223909.78637-227-237658565995457/AnsiballZ_file.py'
Dec 08 19:58:30 compute-0 sudo[81824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:30 compute-0 python3.9[81826]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:30 compute-0 sudo[81824]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:31 compute-0 sudo[81976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxenmiyefxjzodcppkduylltkqaxjusv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223911.0461166-242-142540434296599/AnsiballZ_stat.py'
Dec 08 19:58:31 compute-0 sudo[81976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:31 compute-0 python3.9[81978]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:31 compute-0 sudo[81976]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:31 compute-0 sudo[82099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozpyhhqulxuemjsbdferzsnwudgjlawr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223911.0461166-242-142540434296599/AnsiballZ_copy.py'
Dec 08 19:58:31 compute-0 sudo[82099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:32 compute-0 python3.9[82101]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223911.0461166-242-142540434296599/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=c1f305804951bf563326d28e45f533cb40845402 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:32 compute-0 sudo[82099]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:32 compute-0 sudo[82251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzmaxgiwzyxxrniuhhoehnnmypoazrwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223912.3138435-242-276894079977365/AnsiballZ_stat.py'
Dec 08 19:58:32 compute-0 sudo[82251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:32 compute-0 python3.9[82253]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:32 compute-0 sudo[82251]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:33 compute-0 sudo[82374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsdrclvmhaekybbwnqyhqftoqoaotzbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223912.3138435-242-276894079977365/AnsiballZ_copy.py'
Dec 08 19:58:33 compute-0 sudo[82374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:33 compute-0 python3.9[82376]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223912.3138435-242-276894079977365/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=f527134524422da855356b0b79eac832f1319e86 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:33 compute-0 sudo[82374]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:33 compute-0 sudo[82526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpbhxhwrgccchsstvdmvacaokurrksui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223913.512852-242-227460713914204/AnsiballZ_stat.py'
Dec 08 19:58:33 compute-0 sudo[82526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:33 compute-0 python3.9[82528]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:33 compute-0 sudo[82526]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:34 compute-0 sudo[82649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnsibsapolmbmuwugcxzwzhhzwxtvhls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223913.512852-242-227460713914204/AnsiballZ_copy.py'
Dec 08 19:58:34 compute-0 sudo[82649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:34 compute-0 python3.9[82651]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223913.512852-242-227460713914204/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=760fc1e314169b574a2ffab4841cc1022f7ebdb5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:34 compute-0 sudo[82649]: pam_unix(sudo:session): session closed for user root
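The three copy tasks above stage the per-node OVN TLS material (tls.crt, ca.crt, tls.key) under /var/lib/openstack/certs/ovn/default with root:root ownership and mode 0600, inside a 0755 directory labelled container_file_t. A minimal shell sketch of the equivalent steps, assuming the certificates were issued elsewhere; the /tmp source paths are placeholders, only the destinations and modes are taken from the log:

# Create the target directory with the SELinux type used for container bind mounts.
install -d -m 0755 -o root -g root /var/lib/openstack/certs/ovn/default
chcon -t container_file_t /var/lib/openstack/certs/ovn/default
# Copy the issued certificate, CA chain and private key; 0600 keeps them readable by root only.
install -m 0600 -o root -g root /tmp/compute-0.ctlplane.example.com-tls.crt /var/lib/openstack/certs/ovn/default/tls.crt
install -m 0600 -o root -g root /tmp/compute-0.ctlplane.example.com-ca.crt  /var/lib/openstack/certs/ovn/default/ca.crt
install -m 0600 -o root -g root /tmp/compute-0.ctlplane.example.com-tls.key /var/lib/openstack/certs/ovn/default/tls.key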
Dec 08 19:58:35 compute-0 sudo[82801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvdfzwtezdwihoaehhcpsdwoflielisl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223915.2223647-302-44205668564435/AnsiballZ_file.py'
Dec 08 19:58:35 compute-0 sudo[82801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:35 compute-0 python3.9[82803]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:35 compute-0 sudo[82801]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:36 compute-0 sudo[82953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uadjhxbxbzsvdturkqorptmdalindtfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223915.9850664-310-203067394155709/AnsiballZ_stat.py'
Dec 08 19:58:36 compute-0 sudo[82953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:36 compute-0 python3.9[82955]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:36 compute-0 sudo[82953]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:36 compute-0 sudo[83076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lndwidixzwhwccgtfxufeazmdwqipdst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223915.9850664-310-203067394155709/AnsiballZ_copy.py'
Dec 08 19:58:36 compute-0 sudo[83076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:37 compute-0 python3.9[83078]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223915.9850664-310-203067394155709/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e0eb11d12c0f18deeac27fe895302cd1709bd197 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:37 compute-0 sudo[83076]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:37 compute-0 sudo[83228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wccvphykbhsmgqjxhyfrrzqjbwhxjydp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223917.3947601-326-247935072477004/AnsiballZ_file.py'
Dec 08 19:58:37 compute-0 sudo[83228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:37 compute-0 python3.9[83230]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:37 compute-0 sudo[83228]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:38 compute-0 sudo[83380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmfsstbrlreyaoyxvmhnhobczanwikac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223918.1136565-334-25962314322719/AnsiballZ_stat.py'
Dec 08 19:58:38 compute-0 sudo[83380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:38 compute-0 python3.9[83382]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:38 compute-0 sudo[83380]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:39 compute-0 sudo[83503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znbftwcekggsnazesheqvfpummenrvbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223918.1136565-334-25962314322719/AnsiballZ_copy.py'
Dec 08 19:58:39 compute-0 sudo[83503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:39 compute-0 python3.9[83505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223918.1136565-334-25962314322719/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e0eb11d12c0f18deeac27fe895302cd1709bd197 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:39 compute-0 sudo[83503]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:39 compute-0 chronyd[65682]: Selected source 162.159.200.123 (pool.ntp.org)
Dec 08 19:58:39 compute-0 sudo[83655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivwrnohoqumfusztcsuesptjpkjiadbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223919.4450874-350-136455118415268/AnsiballZ_file.py'
Dec 08 19:58:39 compute-0 sudo[83655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:39 compute-0 python3.9[83657]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:39 compute-0 sudo[83655]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:40 compute-0 sudo[83807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgexovfyxdfhkisdjevtvjcgdyafxvsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223920.1590316-358-52470876463987/AnsiballZ_stat.py'
Dec 08 19:58:40 compute-0 sudo[83807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:40 compute-0 python3.9[83809]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:40 compute-0 sudo[83807]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:41 compute-0 sudo[83930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofzfbjrgjuuswclszlsfdpvloyyefgfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223920.1590316-358-52470876463987/AnsiballZ_copy.py'
Dec 08 19:58:41 compute-0 sudo[83930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:41 compute-0 python3.9[83932]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223920.1590316-358-52470876463987/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e0eb11d12c0f18deeac27fe895302cd1709bd197 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:41 compute-0 sudo[83930]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:41 compute-0 sudo[84082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsulvjhaitugeqlfpaccnfzrltwiensz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223921.5049303-374-163617136997648/AnsiballZ_file.py'
Dec 08 19:58:41 compute-0 sudo[84082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:42 compute-0 python3.9[84084]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:42 compute-0 sudo[84082]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:42 compute-0 sudo[84234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yalseojerjdtvlhivdcqoyzdbqdtgglk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223922.2742124-382-246768050463519/AnsiballZ_stat.py'
Dec 08 19:58:42 compute-0 sudo[84234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:42 compute-0 python3.9[84236]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:42 compute-0 sudo[84234]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:43 compute-0 sudo[84357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfbzgysyvpprdxjppemzsgkgimzxejcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223922.2742124-382-246768050463519/AnsiballZ_copy.py'
Dec 08 19:58:43 compute-0 sudo[84357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:43 compute-0 python3.9[84359]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223922.2742124-382-246768050463519/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e0eb11d12c0f18deeac27fe895302cd1709bd197 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:43 compute-0 sudo[84357]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:43 compute-0 sudo[84509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxzuiaokojqkhtbnkywwyamihisgkkeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223923.6407955-398-275072014543348/AnsiballZ_file.py'
Dec 08 19:58:43 compute-0 sudo[84509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:44 compute-0 python3.9[84511]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:44 compute-0 sudo[84509]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:44 compute-0 sudo[84661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrfwxjvowmqdchtlwauyeeedroommqjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223924.3739588-406-83026441442545/AnsiballZ_stat.py'
Dec 08 19:58:44 compute-0 sudo[84661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:44 compute-0 python3.9[84663]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:44 compute-0 sudo[84661]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:45 compute-0 sudo[84784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apoubueavpbfsxgpqugwfdzodcsnpltd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223924.3739588-406-83026441442545/AnsiballZ_copy.py'
Dec 08 19:58:45 compute-0 sudo[84784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:45 compute-0 python3.9[84786]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223924.3739588-406-83026441442545/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e0eb11d12c0f18deeac27fe895302cd1709bd197 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:45 compute-0 sudo[84784]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:46 compute-0 sudo[84936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owirheerdwauicjkiicohxeusnrnxpff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223925.7423623-422-253435220695683/AnsiballZ_file.py'
Dec 08 19:58:46 compute-0 sudo[84936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:46 compute-0 python3.9[84938]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:46 compute-0 sudo[84936]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:46 compute-0 sudo[85088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xsfiofttbzjybiznsertbecsxvpzognp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223926.4427524-430-271610835787666/AnsiballZ_stat.py'
Dec 08 19:58:46 compute-0 sudo[85088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:46 compute-0 python3.9[85090]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:46 compute-0 sudo[85088]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:47 compute-0 sudo[85211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sllnzfnyzzzfzahbnrsmjyxfhowdjpog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223926.4427524-430-271610835787666/AnsiballZ_copy.py'
Dec 08 19:58:47 compute-0 sudo[85211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:47 compute-0 python3.9[85213]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223926.4427524-430-271610835787666/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e0eb11d12c0f18deeac27fe895302cd1709bd197 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:47 compute-0 sudo[85211]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:48 compute-0 sudo[85363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhkzbrjnsctzukdyykizgyvhgtalsfqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223927.7729688-446-116707563995203/AnsiballZ_file.py'
Dec 08 19:58:48 compute-0 sudo[85363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:48 compute-0 python3.9[85365]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:48 compute-0 sudo[85363]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:48 compute-0 sudo[85515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqnudkbknpzfqrhptnrhuyujbuuqufem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223928.4434023-454-24472129428631/AnsiballZ_stat.py'
Dec 08 19:58:48 compute-0 sudo[85515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:48 compute-0 python3.9[85517]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:58:48 compute-0 sudo[85515]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:49 compute-0 sudo[85638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyrfsdfxlfjccrwdtvdvkqtqkwsnuccz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223928.4434023-454-24472129428631/AnsiballZ_copy.py'
Dec 08 19:58:49 compute-0 sudo[85638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:49 compute-0 python3.9[85640]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223928.4434023-454-24472129428631/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=e0eb11d12c0f18deeac27fe895302cd1709bd197 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:58:49 compute-0 sudo[85638]: pam_unix(sudo:session): session closed for user root
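Between 19:58:35 and 19:58:49 the same tls-ca-bundle.pem (every copy task reports an identical sha1 checksum) is distributed into one cacerts directory per service: telemetry, libvirt, repo-setup, neutron-metadata, ovn, bootstrap and nova. A hedged shell sketch of that repeated pattern, assuming a locally staged bundle at the placeholder path /tmp/tls-ca-bundle.pem:

# Distribute the shared CA bundle into one cacert directory per EDPM service.
for svc in telemetry libvirt repo-setup neutron-metadata ovn bootstrap nova; do
    install -d -m 0755 -o root -g root "/var/lib/openstack/cacerts/${svc}"
    chcon -t container_file_t "/var/lib/openstack/cacerts/${svc}"
    install -m 0644 -o root -g root /tmp/tls-ca-bundle.pem "/var/lib/openstack/cacerts/${svc}/tls-ca-bundle.pem"
done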
Dec 08 19:58:49 compute-0 sshd-session[77977]: Connection closed by 192.168.122.30 port 55598
Dec 08 19:58:49 compute-0 sshd-session[77974]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:58:49 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Dec 08 19:58:49 compute-0 systemd[1]: session-18.scope: Consumed 31.377s CPU time.
Dec 08 19:58:49 compute-0 systemd-logind[793]: Session 18 logged out. Waiting for processes to exit.
Dec 08 19:58:49 compute-0 systemd-logind[793]: Removed session 18.
Dec 08 19:58:55 compute-0 sshd-session[85665]: Accepted publickey for zuul from 192.168.122.30 port 56428 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 19:58:55 compute-0 systemd-logind[793]: New session 19 of user zuul.
Dec 08 19:58:55 compute-0 systemd[1]: Started Session 19 of User zuul.
Dec 08 19:58:55 compute-0 sshd-session[85665]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 19:58:57 compute-0 python3.9[85818]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:58:58 compute-0 sudo[85972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtpqpxxqifwjtchvvruipxfqcfkskevq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223937.5871153-34-43160877753932/AnsiballZ_file.py'
Dec 08 19:58:58 compute-0 sudo[85972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:58 compute-0 python3.9[85974]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:58 compute-0 sudo[85972]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:58 compute-0 sudo[86124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkrkxcsifedmxohmoryaseibzfpzdnbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223938.450562-34-13026285983471/AnsiballZ_file.py'
Dec 08 19:58:58 compute-0 sudo[86124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:58:58 compute-0 python3.9[86126]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:58:58 compute-0 sudo[86124]: pam_unix(sudo:session): session closed for user root
Dec 08 19:58:59 compute-0 python3.9[86276]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:59:00 compute-0 sudo[86426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-escwdnlhijoitdddsxrzzqaprmkuehbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223939.9467285-57-207057832373388/AnsiballZ_seboolean.py'
Dec 08 19:59:00 compute-0 sudo[86426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:00 compute-0 python3.9[86428]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 08 19:59:02 compute-0 sudo[86426]: pam_unix(sudo:session): session closed for user root
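The seboolean task above turns on virt_sandbox_use_netlink persistently, which is what triggers the SELinux policy reload logged by dbus-broker-launch a moment later. Outside Ansible the same change is a single command:

# Persistently allow sandboxed virtualization processes to use netlink sockets.
setsebool -P virt_sandbox_use_netlink on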
Dec 08 19:59:02 compute-0 sudo[86582]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orpgnqarnkvywimpmhqxswqnscxtznry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223942.319395-67-198345800210868/AnsiballZ_setup.py'
Dec 08 19:59:02 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 08 19:59:02 compute-0 sudo[86582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:02 compute-0 python3.9[86584]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 19:59:03 compute-0 sudo[86582]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:03 compute-0 sudo[86666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fddnccwjjsiunyhdaikxpczqvwyepqzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223942.319395-67-198345800210868/AnsiballZ_dnf.py'
Dec 08 19:59:03 compute-0 sudo[86666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:03 compute-0 python3.9[86668]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 19:59:05 compute-0 sudo[86666]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:05 compute-0 sudo[86819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmlqwsdsuedaartpzlsbgmgyrrkijbal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223945.3524313-79-95107484363593/AnsiballZ_systemd.py'
Dec 08 19:59:05 compute-0 sudo[86819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:06 compute-0 python3.9[86821]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 08 19:59:06 compute-0 sudo[86819]: pam_unix(sudo:session): session closed for user root
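The dnf and systemd tasks above ensure the openvswitch package is installed and the service is enabled and running before OVN is configured. A minimal shell sketch of the same two steps:

# Install Open vSwitch (a no-op if already present) and start it now and on every boot.
dnf install -y openvswitch
systemctl enable --now openvswitch.service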
Dec 08 19:59:06 compute-0 sudo[86974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyaszwqikwboabttmunlhcdxdbquydsc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765223946.514817-87-211242218783525/AnsiballZ_edpm_nftables_snippet.py'
Dec 08 19:59:06 compute-0 sudo[86974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:07 compute-0 python3[86976]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 08 19:59:07 compute-0 sudo[86974]: pam_unix(sudo:session): session closed for user root
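The edpm_nftables_snippet task above drops a per-service rule file into /var/lib/edpm-config/firewall for the OVN tunnel traffic (VXLAN 4789/udp, Geneve 6081/udp, plus NOTRACK rules in the raw table). A shell sketch that writes the same ovn.yaml, with the body reconstructed verbatim from the logged content= parameter (exact on-disk formatting may differ):

# Write the OVN firewall snippet later consumed by edpm_nftables_from_files.
cat > /var/lib/edpm-config/firewall/ovn.yaml <<'EOF'
- rule_name: 118 neutron vxlan networks
  rule:
    proto: udp
    dport: 4789
- rule_name: 119 neutron geneve networks
  rule:
    proto: udp
    dport: 6081
    state: ["UNTRACKED"]
- rule_name: 120 neutron geneve networks no conntrack
  rule:
    proto: udp
    dport: 6081
    table: raw
    chain: OUTPUT
    jump: NOTRACK
    action: append
    state: []
- rule_name: 121 neutron geneve networks no conntrack
  rule:
    proto: udp
    dport: 6081
    table: raw
    chain: PREROUTING
    jump: NOTRACK
    action: append
    state: []
EOF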
Dec 08 19:59:07 compute-0 sudo[87126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkwlzdhadurrjvlnadcjrxivqdfmhekr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223947.457101-96-100307572184549/AnsiballZ_file.py'
Dec 08 19:59:07 compute-0 sudo[87126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:07 compute-0 python3.9[87128]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:07 compute-0 sudo[87126]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:08 compute-0 sudo[87278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxlpfmhbyruodqqglgmtotbshuiyiujd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223948.2496614-104-244122474373011/AnsiballZ_stat.py'
Dec 08 19:59:08 compute-0 sudo[87278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:08 compute-0 python3.9[87280]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:08 compute-0 sudo[87278]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:09 compute-0 sudo[87356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqaxddggfhesjjllmfxzhtupchskqhhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223948.2496614-104-244122474373011/AnsiballZ_file.py'
Dec 08 19:59:09 compute-0 sudo[87356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:09 compute-0 python3.9[87358]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:09 compute-0 sudo[87356]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:09 compute-0 sudo[87508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azhuibxtphzmzclftcuksshwtifpraqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223949.7129407-116-102701752872557/AnsiballZ_stat.py'
Dec 08 19:59:09 compute-0 sudo[87508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:10 compute-0 python3.9[87510]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:10 compute-0 sudo[87508]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:10 compute-0 sudo[87586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-corkxgssjyoskyniflevlyncxklayxnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223949.7129407-116-102701752872557/AnsiballZ_file.py'
Dec 08 19:59:10 compute-0 sudo[87586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:10 compute-0 python3.9[87588]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.1b70dy5x recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:10 compute-0 sudo[87586]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:11 compute-0 sudo[87738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehbhjgczvtvvmjmuqlbrwmisdlwllroc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223950.824202-128-243243462114336/AnsiballZ_stat.py'
Dec 08 19:59:11 compute-0 sudo[87738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:11 compute-0 python3.9[87740]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:11 compute-0 sudo[87738]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:11 compute-0 sudo[87816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmjulwnrjjblsdyepzwrwwbplkdiwzlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223950.824202-128-243243462114336/AnsiballZ_file.py'
Dec 08 19:59:11 compute-0 sudo[87816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:11 compute-0 python3.9[87818]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:11 compute-0 sudo[87816]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:12 compute-0 sudo[87968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofkpsgahondutjgpdgqgbwsydqccahlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223952.0968318-141-197851544511124/AnsiballZ_command.py'
Dec 08 19:59:12 compute-0 sudo[87968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:12 compute-0 python3.9[87970]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:12 compute-0 sudo[87968]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:13 compute-0 sudo[88121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxvkpifimbpdyurdxkiwklawcmkdrxvh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765223953.0714557-149-209752231453997/AnsiballZ_edpm_nftables_from_files.py'
Dec 08 19:59:13 compute-0 sudo[88121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:13 compute-0 python3[88123]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 08 19:59:13 compute-0 sudo[88121]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:14 compute-0 sudo[88273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-euxzthpwkwshcnmrryzkigtyusfzsugf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223953.8990753-157-13918865303277/AnsiballZ_stat.py'
Dec 08 19:59:14 compute-0 sudo[88273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:14 compute-0 python3.9[88275]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:14 compute-0 sudo[88273]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:14 compute-0 sudo[88398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrynemmwcnmupgtfepgjkhekdtdppaxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223953.8990753-157-13918865303277/AnsiballZ_copy.py'
Dec 08 19:59:14 compute-0 sudo[88398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:15 compute-0 python3.9[88400]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223953.8990753-157-13918865303277/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:15 compute-0 sudo[88398]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:15 compute-0 sudo[88550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djvgytmhpkyxjutgaivxrmhqfimljoad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223955.3195374-172-204502325429432/AnsiballZ_stat.py'
Dec 08 19:59:15 compute-0 sudo[88550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:15 compute-0 python3.9[88552]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:15 compute-0 sudo[88550]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:16 compute-0 sudo[88675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvnpbpsanruuyrmxwlovjrwwksmiaziw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223955.3195374-172-204502325429432/AnsiballZ_copy.py'
Dec 08 19:59:16 compute-0 sudo[88675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:16 compute-0 python3.9[88677]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223955.3195374-172-204502325429432/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:16 compute-0 sudo[88675]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:16 compute-0 sudo[88827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssogxiprqtdgsutgnjbnpnilqavdbjms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223956.6463547-187-186843460587923/AnsiballZ_stat.py'
Dec 08 19:59:16 compute-0 sudo[88827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:17 compute-0 python3.9[88829]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:17 compute-0 sudo[88827]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:17 compute-0 sudo[88954]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jndbpoumpcygczbzpvubueetnufysvvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223956.6463547-187-186843460587923/AnsiballZ_copy.py'
Dec 08 19:59:17 compute-0 sudo[88954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:17 compute-0 sshd-session[88935]: Invalid user front-user from 172.190.42.55 port 47872
Dec 08 19:59:17 compute-0 sshd-session[88935]: Received disconnect from 172.190.42.55 port 47872:11: Bye Bye [preauth]
Dec 08 19:59:17 compute-0 sshd-session[88935]: Disconnected from invalid user front-user 172.190.42.55 port 47872 [preauth]
Dec 08 19:59:17 compute-0 python3.9[88956]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223956.6463547-187-186843460587923/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:17 compute-0 sudo[88954]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:18 compute-0 sudo[89106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ollkepttkhnzgbvmepqqdgzvbnakbgjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223958.1382132-202-209193706612145/AnsiballZ_stat.py'
Dec 08 19:59:18 compute-0 sudo[89106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:18 compute-0 python3.9[89108]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:18 compute-0 sudo[89106]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:19 compute-0 sudo[89231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vodddobspmqxkvxpvwtycjctrmvtgero ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223958.1382132-202-209193706612145/AnsiballZ_copy.py'
Dec 08 19:59:19 compute-0 sudo[89231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:19 compute-0 python3.9[89233]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223958.1382132-202-209193706612145/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:19 compute-0 sudo[89231]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:20 compute-0 sudo[89385]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnptwrtfpnrlpzocpugbpxikornyknkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223959.5936654-217-190111764325663/AnsiballZ_stat.py'
Dec 08 19:59:20 compute-0 sudo[89385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:20 compute-0 python3.9[89387]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:20 compute-0 sudo[89385]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:20 compute-0 sshd-session[89310]: Received disconnect from 159.223.8.81 port 36602:11: Bye Bye [preauth]
Dec 08 19:59:20 compute-0 sshd-session[89310]: Disconnected from authenticating user root 159.223.8.81 port 36602 [preauth]
Dec 08 19:59:20 compute-0 sudo[89510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yamkmfkuknkoptadqqskaycgfxyzumrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223959.5936654-217-190111764325663/AnsiballZ_copy.py'
Dec 08 19:59:20 compute-0 sudo[89510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:20 compute-0 python3.9[89512]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765223959.5936654-217-190111764325663/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:20 compute-0 sudo[89510]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:21 compute-0 sudo[89662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nshngmapfdlcespzgbvkbsjnuzhdlife ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223961.055956-232-265690258942283/AnsiballZ_file.py'
Dec 08 19:59:21 compute-0 sudo[89662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:21 compute-0 python3.9[89664]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:21 compute-0 sudo[89662]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:22 compute-0 sudo[89814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luhyhaezcyzuxhwioovndvggweehbvpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223961.8105621-240-120241610204541/AnsiballZ_command.py'
Dec 08 19:59:22 compute-0 sudo[89814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:22 compute-0 python3.9[89816]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:22 compute-0 sudo[89814]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:23 compute-0 sudo[89969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vanrzjlpoozsxylnxeueytwaksratmng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223962.5774908-248-62867272619728/AnsiballZ_blockinfile.py'
Dec 08 19:59:23 compute-0 sudo[89969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:23 compute-0 python3.9[89971]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:23 compute-0 sudo[89969]: pam_unix(sudo:session): session closed for user root
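The blockinfile task above inserts a managed block of include statements into /etc/sysconfig/nftables.conf and validates the result with nft -c -f before committing it. A simplified, non-idempotent shell approximation, with the block content and markers taken from the logged parameters:

# Append the managed include block, then syntax-check the whole config before it is used at boot.
cat >> /etc/sysconfig/nftables.conf <<'EOF'
# BEGIN ANSIBLE MANAGED BLOCK
include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
# END ANSIBLE MANAGED BLOCK
EOF
nft -c -f /etc/sysconfig/nftables.conf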
Dec 08 19:59:23 compute-0 sudo[90121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytcryjialbxvmckaeixttoqpywiqmrqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223963.445894-257-942544610098/AnsiballZ_command.py'
Dec 08 19:59:23 compute-0 sudo[90121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:23 compute-0 python3.9[90123]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:23 compute-0 sudo[90121]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:24 compute-0 sudo[90275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfsioppkbffwxngywykrfmiwqfncyjdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223964.1725347-265-246419172693597/AnsiballZ_stat.py'
Dec 08 19:59:24 compute-0 sudo[90275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:24 compute-0 python3.9[90277]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:59:24 compute-0 sudo[90275]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:25 compute-0 sudo[90429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qoxaugmmaiasomwfydpuamrysmfqzpls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223964.8876307-273-26250395905142/AnsiballZ_command.py'
Dec 08 19:59:25 compute-0 sudo[90429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:25 compute-0 python3.9[90431]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:25 compute-0 sudo[90429]: pam_unix(sudo:session): session closed for user root
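Taken together, the last few tasks load the chain definitions, then, because the edpm-rules.nft.changed marker exists, flush and reload the generated rule set and finally remove the marker. A shell sketch of that apply sequence as it appears in the log:

# Load the chain definitions, then reload the rules only when they actually changed.
nft -f /etc/nftables/edpm-chains.nft
if [ -e /etc/nftables/edpm-rules.nft.changed ]; then
    cat /etc/nftables/edpm-flushes.nft \
        /etc/nftables/edpm-rules.nft \
        /etc/nftables/edpm-update-jumps.nft | nft -f -
    rm -f /etc/nftables/edpm-rules.nft.changed
fi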
Dec 08 19:59:25 compute-0 sudo[90584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpxbanqmidpuynynfqlotsiiuttycmeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223965.624635-281-141369428589652/AnsiballZ_file.py'
Dec 08 19:59:25 compute-0 sudo[90584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:26 compute-0 python3.9[90586]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:26 compute-0 sudo[90584]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:27 compute-0 python3.9[90736]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 19:59:28 compute-0 sudo[90887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyhvafqupbmntqelzkmpxizzpqtnalip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223967.8954687-321-240280818735979/AnsiballZ_command.py'
Dec 08 19:59:28 compute-0 sudo[90887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:28 compute-0 python3.9[90889]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:2e:0a:f2:93:49:d5" external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:28 compute-0 ovs-vsctl[90890]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:2e:0a:f2:93:49:d5 external_ids:ovn-encap-ip=172.19.0.100 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 08 19:59:28 compute-0 sudo[90887]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:28 compute-0 sudo[91040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-devkbnooftiatfteifdhkyymdiyfkakp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223968.6172945-330-12635053721067/AnsiballZ_command.py'
Dec 08 19:59:28 compute-0 sudo[91040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:29 compute-0 python3.9[91042]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:29 compute-0 sudo[91040]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:29 compute-0 sudo[91195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlwogtvjsaqlnrxnllguplrsuhliifam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223969.378645-338-240246802276370/AnsiballZ_command.py'
Dec 08 19:59:29 compute-0 sudo[91195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:29 compute-0 python3.9[91197]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:29 compute-0 ovs-vsctl[91198]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 08 19:59:29 compute-0 sudo[91195]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:30 compute-0 python3.9[91348]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:59:31 compute-0 sudo[91501]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sagfzutmcbxzbuvggootjbsjzwlcptmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223970.8362122-355-175386410951114/AnsiballZ_file.py'
Dec 08 19:59:31 compute-0 sudo[91501]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:31 compute-0 python3.9[91503]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:59:31 compute-0 sudo[91501]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:31 compute-0 sudo[91653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjanrwiddfacgxxxfhxbaoyoggujtwdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223971.595969-363-108189865239231/AnsiballZ_stat.py'
Dec 08 19:59:31 compute-0 sudo[91653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:32 compute-0 python3.9[91655]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:32 compute-0 sudo[91653]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:32 compute-0 sudo[91731]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pljodmcrnciqgdnrqstjzvvkylkgmcpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223971.595969-363-108189865239231/AnsiballZ_file.py'
Dec 08 19:59:32 compute-0 sudo[91731]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:32 compute-0 python3.9[91733]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:59:32 compute-0 sudo[91731]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:32 compute-0 sudo[91883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlwiumjrndnkwtaqbsuusjppggxtvpce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223972.6734025-363-139958408901623/AnsiballZ_stat.py'
Dec 08 19:59:32 compute-0 sudo[91883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:33 compute-0 python3.9[91885]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:33 compute-0 sudo[91883]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:33 compute-0 sudo[91963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqfxzmvrokdczstgytxbkyycbtaxoesq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223972.6734025-363-139958408901623/AnsiballZ_file.py'
Dec 08 19:59:33 compute-0 sudo[91963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:33 compute-0 python3.9[91965]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:59:33 compute-0 sudo[91963]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:34 compute-0 sudo[92115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jucatthufdnfhhcrpsdjofqcjnfbardo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223973.7545793-386-182187392968118/AnsiballZ_file.py'
Dec 08 19:59:34 compute-0 sudo[92115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:34 compute-0 python3.9[92117]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:34 compute-0 sudo[92115]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:34 compute-0 sshd-session[91935]: Invalid user soporte from 222.172.32.246 port 2175
Dec 08 19:59:34 compute-0 sudo[92267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvikrfaumnqohvbilmznnnzexzfztrtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223974.470654-394-173864502749492/AnsiballZ_stat.py'
Dec 08 19:59:34 compute-0 sudo[92267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:34 compute-0 python3.9[92269]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:34 compute-0 sudo[92267]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:35 compute-0 sudo[92345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvngipzrznyujudtnfpbntbvxchywjhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223974.470654-394-173864502749492/AnsiballZ_file.py'
Dec 08 19:59:35 compute-0 sudo[92345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:35 compute-0 python3.9[92347]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:35 compute-0 sudo[92345]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:35 compute-0 sudo[92497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efqcgsedynrtwqgajsufsmeiqnardfjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223975.5996609-406-28511632991463/AnsiballZ_stat.py'
Dec 08 19:59:35 compute-0 sudo[92497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:36 compute-0 python3.9[92499]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:36 compute-0 sudo[92497]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:36 compute-0 sudo[92575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfarizzluavjxxbqedlziulkegptomjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223975.5996609-406-28511632991463/AnsiballZ_file.py'
Dec 08 19:59:36 compute-0 sudo[92575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:36 compute-0 python3.9[92577]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:36 compute-0 sudo[92575]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:36 compute-0 sudo[92727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxpwcwkqnkvashncdlidtrrriwouqtmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223976.7142994-418-214986511830134/AnsiballZ_systemd.py'
Dec 08 19:59:36 compute-0 sudo[92727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:37 compute-0 python3.9[92729]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:59:37 compute-0 systemd[1]: Reloading.
Dec 08 19:59:37 compute-0 systemd-sysv-generator[92760]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:59:37 compute-0 systemd-rc-local-generator[92756]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:59:37 compute-0 sudo[92727]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:38 compute-0 sudo[92916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltynnuneavwtwvvrwgctcqnecfvffitt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223977.8231854-426-274492019170155/AnsiballZ_stat.py'
Dec 08 19:59:38 compute-0 sudo[92916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:38 compute-0 python3.9[92918]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:38 compute-0 sudo[92916]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:38 compute-0 sudo[92994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfuablbklzjpbqzynsvxqksupzdyncwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223977.8231854-426-274492019170155/AnsiballZ_file.py'
Dec 08 19:59:38 compute-0 sudo[92994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:38 compute-0 python3.9[92996]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:38 compute-0 sudo[92994]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:39 compute-0 sudo[93146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfucytzzchmzjscnsspcvtrerpcquear ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223978.93514-438-205705033053845/AnsiballZ_stat.py'
Dec 08 19:59:39 compute-0 sudo[93146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:39 compute-0 python3.9[93148]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:39 compute-0 sudo[93146]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:39 compute-0 sudo[93224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiakmqiypbcjrmvkplcdhsahkxryoymi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223978.93514-438-205705033053845/AnsiballZ_file.py'
Dec 08 19:59:39 compute-0 sudo[93224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:39 compute-0 python3.9[93226]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:39 compute-0 sudo[93224]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:40 compute-0 sudo[93376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imjxpovhimjuqyweacfzhqzjfywobjbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223980.1011262-450-209823536550358/AnsiballZ_systemd.py'
Dec 08 19:59:40 compute-0 sudo[93376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:40 compute-0 python3.9[93378]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:59:40 compute-0 systemd[1]: Reloading.
Dec 08 19:59:40 compute-0 systemd-sysv-generator[93407]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:59:40 compute-0 systemd-rc-local-generator[93404]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:59:40 compute-0 systemd[1]: Starting Create netns directory...
Dec 08 19:59:40 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 08 19:59:40 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 08 19:59:40 compute-0 systemd[1]: Finished Create netns directory.
Dec 08 19:59:40 compute-0 sudo[93376]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:41 compute-0 sudo[93569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmlxqgxdfhubrujfimxoiecicldcsxmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223981.2360563-460-37465838006958/AnsiballZ_file.py'
Dec 08 19:59:41 compute-0 sudo[93569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:41 compute-0 python3.9[93571]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:59:41 compute-0 sudo[93569]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:42 compute-0 sudo[93721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvrpbsneaywtlvpgixzvxavpazauyfpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223982.1100347-468-63834870687469/AnsiballZ_stat.py'
Dec 08 19:59:42 compute-0 sudo[93721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:42 compute-0 python3.9[93723]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:42 compute-0 sudo[93721]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:42 compute-0 sudo[93844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrskylnxuowrglyomfgbgeselubqpcfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223982.1100347-468-63834870687469/AnsiballZ_copy.py'
Dec 08 19:59:42 compute-0 sudo[93844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:43 compute-0 python3.9[93846]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765223982.1100347-468-63834870687469/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:59:43 compute-0 sudo[93844]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:43 compute-0 sudo[93996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdjarkqjskqahvetffwdnjwdaulexqfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223983.4480236-485-227736353942637/AnsiballZ_file.py'
Dec 08 19:59:43 compute-0 sudo[93996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:43 compute-0 python3.9[93998]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 19:59:43 compute-0 sudo[93996]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:44 compute-0 sudo[94148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shptdcfcmkbympdldcqmubyvtepkjxiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223984.1436913-493-44164004470886/AnsiballZ_stat.py'
Dec 08 19:59:44 compute-0 sudo[94148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:44 compute-0 python3.9[94150]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 19:59:44 compute-0 sudo[94148]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:45 compute-0 sudo[94271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmmvzpglotrngitnjbksyoyjhdqzmsgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223984.1436913-493-44164004470886/AnsiballZ_copy.py'
Dec 08 19:59:45 compute-0 sudo[94271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:45 compute-0 python3.9[94273]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765223984.1436913-493-44164004470886/.source.json _original_basename=.pu2hjlgn follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:45 compute-0 sudo[94271]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:45 compute-0 sudo[94423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifegtrmxscipggmgkjlpjreqxdoyibxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223985.6116006-508-262845015018073/AnsiballZ_file.py'
Dec 08 19:59:45 compute-0 sudo[94423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:46 compute-0 python3.9[94425]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:46 compute-0 sudo[94423]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:46 compute-0 sudo[94575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kopuiqjkmdqacrnfegljbccenroehkfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223986.305666-516-78424068890698/AnsiballZ_stat.py'
Dec 08 19:59:46 compute-0 sudo[94575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:46 compute-0 sudo[94575]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:47 compute-0 sudo[94698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfhmtaqczdbcazehtzpblufyujjkjwly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223986.305666-516-78424068890698/AnsiballZ_copy.py'
Dec 08 19:59:47 compute-0 sudo[94698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:47 compute-0 sudo[94698]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:48 compute-0 sudo[94850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozacoscbmxsunacipsakezonzcrhfpmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223987.6296575-533-256246308147621/AnsiballZ_container_config_data.py'
Dec 08 19:59:48 compute-0 sudo[94850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:48 compute-0 python3.9[94852]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 08 19:59:48 compute-0 sudo[94850]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:49 compute-0 sudo[95002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqhjjfwdtyqavarxdkwhaxlstvewcepo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223988.6512094-542-62258342915464/AnsiballZ_container_config_hash.py'
Dec 08 19:59:49 compute-0 sudo[95002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:49 compute-0 python3.9[95004]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 08 19:59:49 compute-0 sudo[95002]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:50 compute-0 sudo[95154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofkulxowzhzblnfrtudjxhuctshgkmzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223989.5435553-551-188745488831171/AnsiballZ_podman_container_info.py'
Dec 08 19:59:50 compute-0 sudo[95154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:50 compute-0 python3.9[95156]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 08 19:59:50 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:59:50 compute-0 sudo[95154]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:51 compute-0 sudo[95317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmwktfnlqlcvctxiofmojnhgasqtmjwh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765223990.7576876-564-104557448869603/AnsiballZ_edpm_container_manage.py'
Dec 08 19:59:51 compute-0 sudo[95317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:51 compute-0 python3[95319]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 08 19:59:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:59:51 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:59:51 compute-0 podman[95357]: 2025-12-08 19:59:51.869056515 +0000 UTC m=+0.078852380 container create b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 08 19:59:51 compute-0 podman[95357]: 2025-12-08 19:59:51.831359941 +0000 UTC m=+0.041155816 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 08 19:59:51 compute-0 python3[95319]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 08 19:59:52 compute-0 sudo[95317]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:52 compute-0 sudo[95543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqbmlykqedtbeamwmbggblyspksifzop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223992.1801448-572-213560454516588/AnsiballZ_stat.py'
Dec 08 19:59:52 compute-0 sudo[95543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:52 compute-0 python3.9[95545]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:59:52 compute-0 sudo[95543]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:52 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 08 19:59:53 compute-0 sudo[95697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdmvdowhifzubdmlawyvwfmlyuhvouxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223992.9437788-581-82584369156386/AnsiballZ_file.py'
Dec 08 19:59:53 compute-0 sudo[95697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:53 compute-0 python3.9[95699]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:53 compute-0 sudo[95697]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:53 compute-0 sudo[95773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkndefgkmsqsgstjcsylqrfabcufbrxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223992.9437788-581-82584369156386/AnsiballZ_stat.py'
Dec 08 19:59:53 compute-0 sudo[95773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:53 compute-0 python3.9[95775]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 19:59:53 compute-0 sudo[95773]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:54 compute-0 sudo[95924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkzwbvqtaghyiqrdclhuhxagticxtgpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223993.9266946-581-157315501488216/AnsiballZ_copy.py'
Dec 08 19:59:54 compute-0 sudo[95924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:54 compute-0 python3.9[95926]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765223993.9266946-581-157315501488216/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 19:59:54 compute-0 sudo[95924]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:54 compute-0 sudo[96000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulenaavgpxhauaxpvknpxfnbzbqkrfrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223993.9266946-581-157315501488216/AnsiballZ_systemd.py'
Dec 08 19:59:54 compute-0 sudo[96000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:55 compute-0 python3.9[96002]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 19:59:55 compute-0 systemd[1]: Reloading.
Dec 08 19:59:55 compute-0 systemd-sysv-generator[96031]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:59:55 compute-0 systemd-rc-local-generator[96028]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:59:55 compute-0 sudo[96000]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:55 compute-0 sudo[96112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysrcwpokwekgxnhfjnwqiftrtmlcslig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223993.9266946-581-157315501488216/AnsiballZ_systemd.py'
Dec 08 19:59:55 compute-0 sudo[96112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:56 compute-0 python3.9[96114]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 19:59:56 compute-0 systemd[1]: Reloading.
Dec 08 19:59:56 compute-0 systemd-sysv-generator[96148]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:59:56 compute-0 systemd-rc-local-generator[96144]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:59:56 compute-0 systemd[1]: Starting ovn_controller container...
Dec 08 19:59:56 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 08 19:59:56 compute-0 systemd[1]: Started libcrun container.
Dec 08 19:59:56 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1267ff7e48a4997e28048b1ee0f1190675df365f94cb867e9942d987817238f/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 08 19:59:56 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1.
Dec 08 19:59:56 compute-0 podman[96155]: 2025-12-08 19:59:56.616012819 +0000 UTC m=+0.143676904 container init b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 19:59:56 compute-0 ovn_controller[96170]: + sudo -E kolla_set_configs
Dec 08 19:59:56 compute-0 podman[96155]: 2025-12-08 19:59:56.640511836 +0000 UTC m=+0.168175871 container start b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 08 19:59:56 compute-0 edpm-start-podman-container[96155]: ovn_controller
Dec 08 19:59:56 compute-0 systemd[1]: Created slice User Slice of UID 0.
Dec 08 19:59:56 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 08 19:59:56 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 08 19:59:56 compute-0 systemd[1]: Starting User Manager for UID 0...
Dec 08 19:59:56 compute-0 edpm-start-podman-container[96154]: Creating additional drop-in dependency for "ovn_controller" (b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1)
Dec 08 19:59:56 compute-0 systemd[96207]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Dec 08 19:59:56 compute-0 podman[96176]: 2025-12-08 19:59:56.763942657 +0000 UTC m=+0.111728541 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Dec 08 19:59:56 compute-0 systemd[1]: b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1-310f16cadb6234e2.service: Main process exited, code=exited, status=1/FAILURE
Dec 08 19:59:56 compute-0 systemd[1]: b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1-310f16cadb6234e2.service: Failed with result 'exit-code'.
Dec 08 19:59:56 compute-0 systemd[1]: Reloading.
Dec 08 19:59:56 compute-0 systemd-sysv-generator[96257]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 19:59:56 compute-0 systemd-rc-local-generator[96251]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 19:59:56 compute-0 systemd[96207]: Queued start job for default target Main User Target.
Dec 08 19:59:56 compute-0 systemd[96207]: Created slice User Application Slice.
Dec 08 19:59:56 compute-0 systemd[96207]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 08 19:59:56 compute-0 systemd[96207]: Started Daily Cleanup of User's Temporary Directories.
Dec 08 19:59:56 compute-0 systemd[96207]: Reached target Paths.
Dec 08 19:59:56 compute-0 systemd[96207]: Reached target Timers.
Dec 08 19:59:56 compute-0 systemd[96207]: Starting D-Bus User Message Bus Socket...
Dec 08 19:59:56 compute-0 systemd[96207]: Starting Create User's Volatile Files and Directories...
Dec 08 19:59:56 compute-0 systemd[96207]: Listening on D-Bus User Message Bus Socket.
Dec 08 19:59:56 compute-0 systemd[96207]: Finished Create User's Volatile Files and Directories.
Dec 08 19:59:56 compute-0 systemd[96207]: Reached target Sockets.
Dec 08 19:59:56 compute-0 systemd[96207]: Reached target Basic System.
Dec 08 19:59:56 compute-0 systemd[96207]: Reached target Main User Target.
Dec 08 19:59:56 compute-0 systemd[96207]: Startup finished in 147ms.
Dec 08 19:59:57 compute-0 systemd[1]: Started User Manager for UID 0.
Dec 08 19:59:57 compute-0 systemd[1]: Started ovn_controller container.
Dec 08 19:59:57 compute-0 systemd[1]: Started Session c1 of User root.
Dec 08 19:59:57 compute-0 sudo[96112]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:57 compute-0 ovn_controller[96170]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 08 19:59:57 compute-0 ovn_controller[96170]: INFO:__main__:Validating config file
Dec 08 19:59:57 compute-0 ovn_controller[96170]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 08 19:59:57 compute-0 ovn_controller[96170]: INFO:__main__:Writing out command to execute
Dec 08 19:59:57 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 08 19:59:57 compute-0 ovn_controller[96170]: ++ cat /run_command
Dec 08 19:59:57 compute-0 ovn_controller[96170]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 08 19:59:57 compute-0 ovn_controller[96170]: + ARGS=
Dec 08 19:59:57 compute-0 ovn_controller[96170]: + sudo kolla_copy_cacerts
Dec 08 19:59:57 compute-0 systemd[1]: Started Session c2 of User root.
Dec 08 19:59:57 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 08 19:59:57 compute-0 ovn_controller[96170]: + [[ ! -n '' ]]
Dec 08 19:59:57 compute-0 ovn_controller[96170]: + . kolla_extend_start
Dec 08 19:59:57 compute-0 ovn_controller[96170]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 08 19:59:57 compute-0 ovn_controller[96170]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 08 19:59:57 compute-0 ovn_controller[96170]: + umask 0022
Dec 08 19:59:57 compute-0 ovn_controller[96170]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 08 19:59:57 compute-0 NetworkManager[56229]: <info>  [1765223997.1977] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 08 19:59:57 compute-0 NetworkManager[56229]: <info>  [1765223997.1984] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 19:59:57 compute-0 NetworkManager[56229]: <warn>  [1765223997.1987] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 08 19:59:57 compute-0 NetworkManager[56229]: <info>  [1765223997.1994] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec 08 19:59:57 compute-0 NetworkManager[56229]: <info>  [1765223997.2000] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec 08 19:59:57 compute-0 NetworkManager[56229]: <info>  [1765223997.2003] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 08 19:59:57 compute-0 kernel: br-int: entered promiscuous mode
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00011|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00012|features|INFO|OVS Feature: ct_flush, state: supported
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00013|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00014|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00015|main|INFO|OVS feature set changed, force recompute.
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00016|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00018|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00019|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00020|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00022|main|INFO|OVS feature set changed, force recompute.
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 08 19:59:57 compute-0 ovn_controller[96170]: 2025-12-08T19:59:57Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 08 19:59:57 compute-0 NetworkManager[56229]: <info>  [1765223997.2335] manager: (ovn-b67045-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 08 19:59:57 compute-0 systemd-udevd[96323]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 19:59:57 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Dec 08 19:59:57 compute-0 NetworkManager[56229]: <info>  [1765223997.2675] device (genev_sys_6081): carrier: link connected
Dec 08 19:59:57 compute-0 NetworkManager[56229]: <info>  [1765223997.2684] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 08 19:59:57 compute-0 systemd-udevd[96325]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 19:59:57 compute-0 sudo[96431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zyfsrsrokdfxpawyccizhtkdzucgxnko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223997.2527294-609-10417317994926/AnsiballZ_command.py'
Dec 08 19:59:57 compute-0 sudo[96431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:57 compute-0 python3.9[96433]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:57 compute-0 ovs-vsctl[96434]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 08 19:59:57 compute-0 sudo[96431]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:58 compute-0 sudo[96584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxjhdgwtdxzgtqrmkpidstwmzlswteul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223997.9539118-617-60945321460284/AnsiballZ_command.py'
Dec 08 19:59:58 compute-0 sudo[96584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:58 compute-0 python3.9[96586]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:58 compute-0 ovs-vsctl[96588]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 08 19:59:58 compute-0 sudo[96584]: pam_unix(sudo:session): session closed for user root
Dec 08 19:59:59 compute-0 sudo[96739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ildfbphhkyxhtmjmoapeemzmdkunpgcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765223998.8971367-631-261184566770452/AnsiballZ_command.py'
Dec 08 19:59:59 compute-0 sudo[96739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 19:59:59 compute-0 python3.9[96741]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 19:59:59 compute-0 ovs-vsctl[96742]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 08 19:59:59 compute-0 sudo[96739]: pam_unix(sudo:session): session closed for user root
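
The two tasks above first probe external_ids:ovn-cms-options (the db_ctl_base ERR at 19:59:58 simply means the key is not set) and then remove it, apparently so that no stale CMS options remain on the node. Below is a minimal Python sketch of the same probe-then-remove sequence, using the ovs-vsctl invocations taken from the log; subprocess stands in for the Ansible command module and is not what AnsiballZ_command.py actually runs.

    import subprocess

    # Probe the key; a non-zero exit matches the db_ctl_base ERR logged at
    # 19:59:58 and just means external_ids:ovn-cms-options is not set.
    probe = subprocess.run(
        ["ovs-vsctl", "get", "Open_vSwitch", ".", "external_ids:ovn-cms-options"],
        capture_output=True, text=True,
    )
    print("ovn-cms-options:", probe.stdout.strip() or "<not set>")

    # "remove" is idempotent, so it is safe to run whether or not the probe
    # found anything (the logged playbook also runs it unconditionally).
    subprocess.run(
        ["ovs-vsctl", "remove", "Open_vSwitch", ".", "external_ids", "ovn-cms-options"],
        check=True,
    )
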
Dec 08 19:59:59 compute-0 sshd-session[85668]: Connection closed by 192.168.122.30 port 56428
Dec 08 19:59:59 compute-0 sshd-session[85665]: pam_unix(sshd:session): session closed for user zuul
Dec 08 19:59:59 compute-0 systemd-logind[793]: Session 19 logged out. Waiting for processes to exit.
Dec 08 19:59:59 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Dec 08 19:59:59 compute-0 systemd[1]: session-19.scope: Consumed 48.034s CPU time.
Dec 08 19:59:59 compute-0 systemd-logind[793]: Removed session 19.
Dec 08 20:00:06 compute-0 sshd-session[96769]: Accepted publickey for zuul from 192.168.122.30 port 38874 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 20:00:06 compute-0 systemd-logind[793]: New session 21 of user zuul.
Dec 08 20:00:06 compute-0 systemd[1]: Started Session 21 of User zuul.
Dec 08 20:00:06 compute-0 sshd-session[96769]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 20:00:07 compute-0 python3.9[96922]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 20:00:07 compute-0 systemd[1]: Stopping User Manager for UID 0...
Dec 08 20:00:07 compute-0 systemd[96207]: Activating special unit Exit the Session...
Dec 08 20:00:07 compute-0 systemd[96207]: Stopped target Main User Target.
Dec 08 20:00:07 compute-0 systemd[96207]: Stopped target Basic System.
Dec 08 20:00:07 compute-0 systemd[96207]: Stopped target Paths.
Dec 08 20:00:07 compute-0 systemd[96207]: Stopped target Sockets.
Dec 08 20:00:07 compute-0 systemd[96207]: Stopped target Timers.
Dec 08 20:00:07 compute-0 systemd[96207]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 08 20:00:07 compute-0 systemd[96207]: Closed D-Bus User Message Bus Socket.
Dec 08 20:00:07 compute-0 systemd[96207]: Stopped Create User's Volatile Files and Directories.
Dec 08 20:00:07 compute-0 systemd[96207]: Removed slice User Application Slice.
Dec 08 20:00:07 compute-0 systemd[96207]: Reached target Shutdown.
Dec 08 20:00:07 compute-0 systemd[96207]: Finished Exit the Session.
Dec 08 20:00:07 compute-0 systemd[96207]: Reached target Exit the Session.
Dec 08 20:00:07 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Dec 08 20:00:07 compute-0 systemd[1]: Stopped User Manager for UID 0.
Dec 08 20:00:07 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 08 20:00:07 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 08 20:00:07 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 08 20:00:07 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 08 20:00:07 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Dec 08 20:00:08 compute-0 sudo[97078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbbccttwryjyfhkcahwpkykbjbspajjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224007.7469323-34-131669126899582/AnsiballZ_file.py'
Dec 08 20:00:08 compute-0 sudo[97078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:08 compute-0 python3.9[97080]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:08 compute-0 sudo[97078]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:08 compute-0 sudo[97230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voicntbyuumjccngwjdsytypwlmaaxbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224008.5625603-34-39671766142820/AnsiballZ_file.py'
Dec 08 20:00:08 compute-0 sudo[97230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:09 compute-0 python3.9[97232]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:09 compute-0 sudo[97230]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:09 compute-0 sudo[97382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygjgsodmuvvcwyxltrnstafksdhciacq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224009.274688-34-74480823796759/AnsiballZ_file.py'
Dec 08 20:00:09 compute-0 sudo[97382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:09 compute-0 python3.9[97384]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:09 compute-0 sshd-session[96767]: Received disconnect from 45.78.228.32 port 51688:11: Bye Bye [preauth]
Dec 08 20:00:09 compute-0 sshd-session[96767]: Disconnected from authenticating user root 45.78.228.32 port 51688 [preauth]
Dec 08 20:00:09 compute-0 sudo[97382]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:10 compute-0 sudo[97534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjpqxhywaadjwpfyjsbvbapecsapcixp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224010.0683968-34-144790024999759/AnsiballZ_file.py'
Dec 08 20:00:10 compute-0 sudo[97534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:10 compute-0 python3.9[97536]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:10 compute-0 sudo[97534]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:10 compute-0 sudo[97686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovmmrfeztixmehdvojioaygrhjmchcse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224010.7300472-34-70645699761211/AnsiballZ_file.py'
Dec 08 20:00:10 compute-0 sudo[97686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:11 compute-0 python3.9[97688]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:11 compute-0 sudo[97686]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:12 compute-0 python3.9[97838]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 20:00:12 compute-0 sshd[1006]: Timeout before authentication for connection from 101.47.160.247 to 38.102.83.66, pid = 78156
Dec 08 20:00:12 compute-0 sudo[97988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kojtgujbjqwaynzwiilnxuutzxgdjybj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224012.2149541-78-157815138423687/AnsiballZ_seboolean.py'
Dec 08 20:00:12 compute-0 sudo[97988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:12 compute-0 python3.9[97990]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 08 20:00:13 compute-0 sudo[97988]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:14 compute-0 python3.9[98140]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:15 compute-0 python3.9[98261]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224013.862111-86-208993116023849/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:15 compute-0 python3.9[98412]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:16 compute-0 python3.9[98533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224015.4278975-101-171400890278269/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:17 compute-0 sudo[98683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kggleotcerrktgsvmdoqmpwyjzosukbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224016.8297985-118-50594789869225/AnsiballZ_setup.py'
Dec 08 20:00:17 compute-0 sudo[98683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:17 compute-0 python3.9[98685]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 20:00:17 compute-0 sudo[98683]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:18 compute-0 sudo[98767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkmxbdjqkanvvhumgsemrolnjyfitugm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224016.8297985-118-50594789869225/AnsiballZ_dnf.py'
Dec 08 20:00:18 compute-0 sudo[98767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:18 compute-0 python3.9[98769]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 20:00:19 compute-0 sudo[98767]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:20 compute-0 sudo[98920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgahhdmqyronhsdowbhahpfhgfeexhdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224019.9858685-130-165518992794839/AnsiballZ_systemd.py'
Dec 08 20:00:20 compute-0 sudo[98920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:21 compute-0 python3.9[98922]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 08 20:00:21 compute-0 sudo[98920]: pam_unix(sudo:session): session closed for user root
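
The pair of tasks at 20:00:18-20:00:21 ensures the openvswitch package is installed and openvswitch.service is enabled and running before the OVN containers that list it in depends_on are started. A rough equivalent of those two tasks as direct commands; this is a sketch only, not what AnsiballZ_dnf.py or AnsiballZ_systemd.py actually execute.

    import subprocess

    # Package present (the dnf task with state=present), then service
    # enabled and started (the systemd task with enabled=True state=started).
    subprocess.run(["dnf", "-y", "install", "openvswitch"], check=True)
    subprocess.run(["systemctl", "enable", "--now", "openvswitch.service"], check=True)
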
Dec 08 20:00:21 compute-0 sshd-session[98923]: Invalid user svn from 159.223.8.81 port 52474
Dec 08 20:00:21 compute-0 sshd-session[98923]: Received disconnect from 159.223.8.81 port 52474:11: Bye Bye [preauth]
Dec 08 20:00:21 compute-0 sshd-session[98923]: Disconnected from invalid user svn 159.223.8.81 port 52474 [preauth]
Dec 08 20:00:22 compute-0 python3.9[99077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:22 compute-0 python3.9[99198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224021.5891504-138-169331026754421/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:23 compute-0 python3.9[99348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:23 compute-0 python3.9[99469]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224022.7099442-138-132013028994393/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:24 compute-0 python3.9[99619]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:25 compute-0 python3.9[99740]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224024.4680953-182-227035561214913/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:25 compute-0 python3.9[99890]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:26 compute-0 python3.9[100011]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224025.5684652-182-159447360293372/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:26 compute-0 ovn_controller[96170]: 2025-12-08T20:00:26Z|00025|memory|INFO|16256 kB peak resident set size after 29.8 seconds
Dec 08 20:00:26 compute-0 ovn_controller[96170]: 2025-12-08T20:00:26Z|00026|memory|INFO|idl-cells-OVN_Southbound:239 idl-cells-Open_vSwitch:471 ofctrl_desired_flow_usage-KB:5 ofctrl_installed_flow_usage-KB:4 ofctrl_sb_flow_ref_usage-KB:2
Dec 08 20:00:27 compute-0 podman[100135]: 2025-12-08 20:00:27.021578822 +0000 UTC m=+0.112161146 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 08 20:00:27 compute-0 python3.9[100171]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:00:27 compute-0 sudo[100336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzyxuerksoxymydglcsvkqatvkvwzvhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224027.6343913-220-65287692786766/AnsiballZ_file.py'
Dec 08 20:00:27 compute-0 sudo[100336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:28 compute-0 python3.9[100338]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:28 compute-0 sudo[100336]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:28 compute-0 sudo[100488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwlfwylyvfscsmudxwpzizlrkksigpph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224028.3034825-228-17713599731666/AnsiballZ_stat.py'
Dec 08 20:00:28 compute-0 sudo[100488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:28 compute-0 python3.9[100490]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:28 compute-0 sudo[100488]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:28 compute-0 sudo[100566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfykthqolowspsayyoisxaddhyuzkalc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224028.3034825-228-17713599731666/AnsiballZ_file.py'
Dec 08 20:00:28 compute-0 sudo[100566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:29 compute-0 python3.9[100568]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:29 compute-0 sudo[100566]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:29 compute-0 sudo[100718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gttsfkpuebyxhgyhswmpqrbxnpouvwau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224029.3137076-228-70380533898062/AnsiballZ_stat.py'
Dec 08 20:00:29 compute-0 sudo[100718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:29 compute-0 python3.9[100720]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:29 compute-0 sudo[100718]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:29 compute-0 sudo[100796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-degdrajtlrtedsrttfidjnklnmytobpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224029.3137076-228-70380533898062/AnsiballZ_file.py'
Dec 08 20:00:29 compute-0 sudo[100796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:30 compute-0 python3.9[100798]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:30 compute-0 sudo[100796]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:30 compute-0 sudo[100948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jarjmyjnnxytfstldxtvupawbcnhtumq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224030.3748026-251-106596447039363/AnsiballZ_file.py'
Dec 08 20:00:30 compute-0 sudo[100948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:30 compute-0 python3.9[100950]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:00:30 compute-0 sudo[100948]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:31 compute-0 sudo[101100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmhrsqnatrvrqotcfyyodjkwwhygabgz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224031.0225687-259-58460492413076/AnsiballZ_stat.py'
Dec 08 20:00:31 compute-0 sudo[101100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:31 compute-0 python3.9[101102]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:31 compute-0 sudo[101100]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:31 compute-0 sudo[101178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byviripksulutnmhkrjjkouzaigimoid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224031.0225687-259-58460492413076/AnsiballZ_file.py'
Dec 08 20:00:31 compute-0 sudo[101178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:32 compute-0 python3.9[101180]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:00:32 compute-0 sudo[101178]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:32 compute-0 sudo[101330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfilrbphiwnrbttemqarcwmyxmtngeej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224032.2770584-271-270800539637621/AnsiballZ_stat.py'
Dec 08 20:00:32 compute-0 sudo[101330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:32 compute-0 python3.9[101332]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:32 compute-0 sudo[101330]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:32 compute-0 sudo[101408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewnrtuqzxcbymvwhktgbxmzjhrvpcweq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224032.2770584-271-270800539637621/AnsiballZ_file.py'
Dec 08 20:00:32 compute-0 sudo[101408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:33 compute-0 python3.9[101410]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:00:33 compute-0 sudo[101408]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:33 compute-0 sudo[101560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbgssilxaxmrieykwrtwoobitodaienb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224033.316828-283-30726055365253/AnsiballZ_systemd.py'
Dec 08 20:00:33 compute-0 sudo[101560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:33 compute-0 python3.9[101562]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:00:33 compute-0 systemd[1]: Reloading.
Dec 08 20:00:34 compute-0 systemd-rc-local-generator[101591]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:00:34 compute-0 systemd-sysv-generator[101595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:00:34 compute-0 sudo[101560]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:34 compute-0 sudo[101750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxgsnbmgbdrkgehpjuhpqnnvatnntoam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224034.4566464-291-68527021506673/AnsiballZ_stat.py'
Dec 08 20:00:34 compute-0 sudo[101750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:34 compute-0 python3.9[101752]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:34 compute-0 sudo[101750]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:35 compute-0 sudo[101828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baxhtqizymdkylvrlrsgsefssatzpaab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224034.4566464-291-68527021506673/AnsiballZ_file.py'
Dec 08 20:00:35 compute-0 sudo[101828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:35 compute-0 sshd[1006]: drop connection #1 from [101.47.160.247]:44798 on [38.102.83.66]:22 penalty: exceeded LoginGraceTime
Dec 08 20:00:35 compute-0 python3.9[101830]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:00:35 compute-0 sudo[101828]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:35 compute-0 sudo[101980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywsandlocokhxkfggodrmznipkyyougj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224035.5319147-303-114313452793738/AnsiballZ_stat.py'
Dec 08 20:00:35 compute-0 sudo[101980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:35 compute-0 python3.9[101982]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:36 compute-0 sudo[101980]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:36 compute-0 sudo[102058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mstygsykkqnmgxyhtcrftumvlaplxwcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224035.5319147-303-114313452793738/AnsiballZ_file.py'
Dec 08 20:00:36 compute-0 sudo[102058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:36 compute-0 python3.9[102060]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:00:36 compute-0 sudo[102058]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:37 compute-0 sudo[102210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqkmeupgkudrvlieyoawbklvnyyaxmcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224036.7429695-315-113603965669442/AnsiballZ_systemd.py'
Dec 08 20:00:37 compute-0 sudo[102210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:37 compute-0 python3.9[102212]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:00:37 compute-0 systemd[1]: Reloading.
Dec 08 20:00:37 compute-0 systemd-rc-local-generator[102239]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:00:37 compute-0 systemd-sysv-generator[102243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:00:37 compute-0 systemd[1]: Starting Create netns directory...
Dec 08 20:00:37 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 08 20:00:37 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 08 20:00:37 compute-0 systemd[1]: Finished Create netns directory.
Dec 08 20:00:37 compute-0 sudo[102210]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:38 compute-0 sudo[102403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmmepigpfratbvjzgnsrypgwffuepcuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224037.9033194-325-46073657943826/AnsiballZ_file.py'
Dec 08 20:00:38 compute-0 sudo[102403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:38 compute-0 python3.9[102405]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:38 compute-0 sudo[102403]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:38 compute-0 sudo[102557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amwuxpfnytqhkywgcskkddijqfbccloc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224038.4965594-333-251023298384242/AnsiballZ_stat.py'
Dec 08 20:00:38 compute-0 sudo[102557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:38 compute-0 sshd-session[102482]: Invalid user wwwroot from 172.190.42.55 port 56226
Dec 08 20:00:38 compute-0 sshd-session[102482]: Received disconnect from 172.190.42.55 port 56226:11: Bye Bye [preauth]
Dec 08 20:00:38 compute-0 sshd-session[102482]: Disconnected from invalid user wwwroot 172.190.42.55 port 56226 [preauth]
Dec 08 20:00:38 compute-0 python3.9[102559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:38 compute-0 sudo[102557]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:39 compute-0 sudo[102680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trkovixnwqxmibyaozowkiryopozwzfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224038.4965594-333-251023298384242/AnsiballZ_copy.py'
Dec 08 20:00:39 compute-0 sudo[102680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:39 compute-0 python3.9[102682]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224038.4965594-333-251023298384242/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:39 compute-0 sudo[102680]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:40 compute-0 sudo[102832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-miyoesmcimragibealjpdlacadevbwda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224039.8271666-350-157766927813292/AnsiballZ_file.py'
Dec 08 20:00:40 compute-0 sudo[102832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:40 compute-0 python3.9[102834]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:00:40 compute-0 sudo[102832]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:41 compute-0 sudo[102984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnplijxfpejdofhfacwqhumgbqumgoay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224040.7588487-358-168269901831081/AnsiballZ_stat.py'
Dec 08 20:00:41 compute-0 sudo[102984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:41 compute-0 python3.9[102986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:00:41 compute-0 sudo[102984]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:41 compute-0 sudo[103107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrozivpuxrtegzezfwoigzmufbtsfoem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224040.7588487-358-168269901831081/AnsiballZ_copy.py'
Dec 08 20:00:41 compute-0 sudo[103107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:41 compute-0 python3.9[103109]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224040.7588487-358-168269901831081/.source.json _original_basename=.r12sdzjy follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:00:41 compute-0 sudo[103107]: pam_unix(sudo:session): session closed for user root
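
The JSON written here is masked in the log (content=NOT_LOGGING_PARAMETER); later, at 20:00:48, it is bind-mounted into the container as /var/lib/kolla/config_files/config.json with KOLLA_CONFIG_STRATEGY=COPY_ALWAYS. A small sketch that only verifies the file parses as JSON and lists its top-level keys, deliberately assuming nothing about its schema:

    import json

    PATH = "/var/lib/kolla/config_files/ovn_metadata_agent.json"
    with open(PATH) as handle:
        config = json.load(handle)
    # The content is not logged above, so just show the key names.
    print(sorted(config))
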
Dec 08 20:00:42 compute-0 sudo[103259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohvriydbulxpeiflgotqgpivkrssmglq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224042.003612-373-223825572328169/AnsiballZ_file.py'
Dec 08 20:00:42 compute-0 sudo[103259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:42 compute-0 python3.9[103261]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:00:42 compute-0 sudo[103259]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:43 compute-0 sudo[103411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlleprlppvrvinhawtratiydagexbocr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224042.7657409-381-20509093668411/AnsiballZ_stat.py'
Dec 08 20:00:43 compute-0 sudo[103411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:43 compute-0 sudo[103411]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:43 compute-0 sudo[103534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtjxnqauabtqhizdfszerjkyaoesezoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224042.7657409-381-20509093668411/AnsiballZ_copy.py'
Dec 08 20:00:43 compute-0 sudo[103534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:43 compute-0 sudo[103534]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:44 compute-0 sudo[103686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzyptdudaehunnlblwddygidgytxtbht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224044.3848932-398-259017632857149/AnsiballZ_container_config_data.py'
Dec 08 20:00:44 compute-0 sudo[103686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:45 compute-0 python3.9[103688]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 08 20:00:45 compute-0 sudo[103686]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:45 compute-0 sudo[103838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwbusnhnfwvycvpvytdhvvlteiccjjqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224045.296098-407-225742578994063/AnsiballZ_container_config_hash.py'
Dec 08 20:00:45 compute-0 sudo[103838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:45 compute-0 python3.9[103840]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 08 20:00:46 compute-0 sudo[103838]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:46 compute-0 sudo[103990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wargmtahwdvydsqxyjkpkmfoqqurodig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224046.2727768-416-120384002388820/AnsiballZ_podman_container_info.py'
Dec 08 20:00:46 compute-0 sudo[103990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:46 compute-0 python3.9[103992]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 08 20:00:47 compute-0 sudo[103990]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:48 compute-0 sudo[104168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jctikebabkhotqmysfcgmlftjxfmfapb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224047.4965484-429-186239946049921/AnsiballZ_edpm_container_manage.py'
Dec 08 20:00:48 compute-0 sudo[104168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:48 compute-0 python3[104170]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 08 20:00:48 compute-0 podman[104204]: 2025-12-08 20:00:48.624034115 +0000 UTC m=+0.024304961 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:00:48 compute-0 podman[104204]: 2025-12-08 20:00:48.735670444 +0000 UTC m=+0.135941270 container create 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 08 20:00:48 compute-0 python3[104170]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:00:48 compute-0 sudo[104168]: pam_unix(sudo:session): session closed for user root
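
ansible-edpm_container_manage reads the *.json files under /var/lib/edpm-config/container-startup-config/ovn_metadata_agent (the config_dir passed at 20:00:48) and turns them into the podman create call logged just above. Below is a minimal reader for those files; the exact file layout is an assumption (settings may sit at the top level or be nested under the container name), but the 'image' and 'volumes' keys mirror the config_data label in the create event.

    import glob
    import json

    CONFIG_DIR = "/var/lib/edpm-config/container-startup-config/ovn_metadata_agent"

    for path in glob.glob(f"{CONFIG_DIR}/*.json"):
        with open(path) as handle:
            data = json.load(handle)
        # Layout assumption: either a flat settings object or a mapping of
        # container name -> settings; handle both shapes.
        entries = {"ovn_metadata_agent": data} if "image" in data else data
        for name, cfg in entries.items():
            print(name, "->", cfg.get("image"))
            for volume in cfg.get("volumes", []):
                print("    volume:", volume)
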
Dec 08 20:00:49 compute-0 sudo[104391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grbegkscgwbzayrmmpthodxaxrmvpomv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224048.9982247-437-81029996317636/AnsiballZ_stat.py'
Dec 08 20:00:49 compute-0 sudo[104391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:49 compute-0 python3.9[104393]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:00:49 compute-0 sudo[104391]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:49 compute-0 sudo[104545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzkhxbuwvchnblkxllytdwpcndpwtdni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224049.707658-446-65459360144962/AnsiballZ_file.py'
Dec 08 20:00:49 compute-0 sudo[104545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:50 compute-0 python3.9[104547]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:00:50 compute-0 sudo[104545]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:50 compute-0 sudo[104621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jazrozmruwcnrhpbfcdqepgrdjmpdypv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224049.707658-446-65459360144962/AnsiballZ_stat.py'
Dec 08 20:00:50 compute-0 sudo[104621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:50 compute-0 python3.9[104623]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:00:50 compute-0 sudo[104621]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:51 compute-0 sudo[104772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgmmxlykhpwzcwpevcmpwdclgrqafkvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224050.6344473-446-58023373534767/AnsiballZ_copy.py'
Dec 08 20:00:51 compute-0 sudo[104772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:51 compute-0 python3.9[104774]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765224050.6344473-446-58023373534767/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:00:51 compute-0 sudo[104772]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:51 compute-0 sudo[104848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msqrsqmjdxuklznrpgngislmofggcziv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224050.6344473-446-58023373534767/AnsiballZ_systemd.py'
Dec 08 20:00:51 compute-0 sudo[104848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:51 compute-0 python3.9[104850]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:00:51 compute-0 systemd[1]: Reloading.
Dec 08 20:00:51 compute-0 systemd-rc-local-generator[104875]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:00:51 compute-0 systemd-sysv-generator[104880]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:00:51 compute-0 sudo[104848]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:52 compute-0 sudo[104959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhjzdihrwoyrmyuibisctqwoqpjlxzeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224050.6344473-446-58023373534767/AnsiballZ_systemd.py'
Dec 08 20:00:52 compute-0 sudo[104959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:00:52 compute-0 python3.9[104961]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:00:52 compute-0 systemd[1]: Reloading.
Dec 08 20:00:52 compute-0 systemd-rc-local-generator[104992]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:00:52 compute-0 systemd-sysv-generator[104995]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:00:52 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Dec 08 20:00:52 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:00:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d2005d5cfb91092376e9fe555a7125c86179441fefc546e1a668096f92f857b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 08 20:00:52 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d2005d5cfb91092376e9fe555a7125c86179441fefc546e1a668096f92f857b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 08 20:00:53 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398.
Dec 08 20:00:53 compute-0 podman[105003]: 2025-12-08 20:00:53.058705818 +0000 UTC m=+0.193947860 container init 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: + sudo -E kolla_set_configs
Dec 08 20:00:53 compute-0 podman[105003]: 2025-12-08 20:00:53.090564771 +0000 UTC m=+0.225806803 container start 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 08 20:00:53 compute-0 edpm-start-podman-container[105003]: ovn_metadata_agent
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Validating config file
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Copying service configuration files
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Writing out command to execute
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: ++ cat /run_command
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: + CMD=neutron-ovn-metadata-agent
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: + ARGS=
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: + sudo kolla_copy_cacerts
Dec 08 20:00:53 compute-0 edpm-start-podman-container[105002]: Creating additional drop-in dependency for "ovn_metadata_agent" (2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398)
Dec 08 20:00:53 compute-0 podman[105026]: 2025-12-08 20:00:53.160667192 +0000 UTC m=+0.057893318 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 08 20:00:53 compute-0 systemd[1]: Reloading.
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: + [[ ! -n '' ]]
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: + . kolla_extend_start
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: Running command: 'neutron-ovn-metadata-agent'
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: + umask 0022
Dec 08 20:00:53 compute-0 ovn_metadata_agent[105019]: + exec neutron-ovn-metadata-agent
Dec 08 20:00:53 compute-0 systemd-sysv-generator[105096]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:00:53 compute-0 systemd-rc-local-generator[105093]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:00:53 compute-0 systemd[1]: Started ovn_metadata_agent container.
Dec 08 20:00:53 compute-0 sudo[104959]: pam_unix(sudo:session): session closed for user root
Dec 08 20:00:53 compute-0 sshd-session[96772]: Connection closed by 192.168.122.30 port 38874
Dec 08 20:00:53 compute-0 sshd-session[96769]: pam_unix(sshd:session): session closed for user zuul
Dec 08 20:00:53 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Dec 08 20:00:53 compute-0 systemd[1]: session-21.scope: Consumed 33.717s CPU time.
Dec 08 20:00:53 compute-0 systemd-logind[793]: Session 21 logged out. Waiting for processes to exit.
Dec 08 20:00:53 compute-0 systemd-logind[793]: Removed session 21.
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.933 105024 INFO neutron.common.config [-] Logging enabled!
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.934 105024 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.934 105024 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.934 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.935 105024 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.935 105024 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.935 105024 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.935 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.935 105024 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.935 105024 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.935 105024 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.936 105024 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.936 105024 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.936 105024 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.936 105024 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.936 105024 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.936 105024 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.936 105024 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.937 105024 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.937 105024 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.937 105024 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.937 105024 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.937 105024 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.937 105024 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.937 105024 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.938 105024 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.938 105024 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.938 105024 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.938 105024 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.938 105024 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.938 105024 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.939 105024 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.939 105024 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.939 105024 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.939 105024 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.939 105024 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.939 105024 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.939 105024 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.940 105024 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.940 105024 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.940 105024 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.940 105024 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.940 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.940 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.941 105024 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.941 105024 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.941 105024 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.941 105024 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.941 105024 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.941 105024 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.941 105024 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.941 105024 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.942 105024 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.942 105024 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.942 105024 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.942 105024 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.942 105024 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.942 105024 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.942 105024 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.942 105024 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.943 105024 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.943 105024 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.943 105024 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.943 105024 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.943 105024 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.943 105024 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.943 105024 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.943 105024 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.943 105024 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.944 105024 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.944 105024 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.944 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.944 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.944 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.944 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.944 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.944 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.944 105024 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.945 105024 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.946 105024 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.947 105024 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.947 105024 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.947 105024 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.947 105024 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.947 105024 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.947 105024 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.947 105024 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.948 105024 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.948 105024 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.948 105024 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.948 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.948 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.948 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.948 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.949 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.949 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.949 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.949 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.949 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.949 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.949 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.949 105024 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.950 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.950 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.950 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.950 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.950 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.950 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.950 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.951 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.951 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.951 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.951 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.951 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.951 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.951 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.952 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.952 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.952 105024 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.952 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.952 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.952 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.952 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.952 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.953 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.953 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.953 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.953 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.953 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.953 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.953 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.953 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.954 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.954 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.954 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.954 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.954 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.954 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.954 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.954 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.954 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.955 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.956 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.956 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.956 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.956 105024 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.956 105024 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.956 105024 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.956 105024 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.957 105024 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.957 105024 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.957 105024 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.957 105024 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.957 105024 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.957 105024 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.957 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.957 105024 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.958 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.959 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.959 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.959 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.959 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.959 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.959 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.959 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.959 105024 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.959 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.960 105024 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.961 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.962 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.962 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.962 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.962 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.962 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.962 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.962 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.962 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.963 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.963 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.963 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.963 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.963 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.963 105024 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.963 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.963 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.963 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.964 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.964 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.964 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.964 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.964 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.964 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.964 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.964 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.965 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.965 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.965 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.965 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.965 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.965 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.965 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.966 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.966 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.966 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.966 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.966 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.966 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.966 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.967 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.967 105024 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.967 105024 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.967 105024 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.967 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.967 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.967 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.968 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.968 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.968 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.968 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.968 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.968 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.968 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.968 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.969 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.969 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.969 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.969 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.969 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.969 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.969 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.969 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.969 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.970 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.970 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.970 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.970 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.970 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.970 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.970 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.970 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.970 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.971 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.971 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.971 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.971 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.971 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.971 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.971 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.971 105024 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.971 105024 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.982 105024 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.982 105024 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.982 105024 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.982 105024 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.983 105024 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Dec 08 20:00:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:54.998 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 7a8539fb-8779-42f7-8fa8-222db61ea5ae (UUID: 7a8539fb-8779-42f7-8fa8-222db61ea5ae) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.026 105024 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.026 105024 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.027 105024 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.027 105024 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.031 105024 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.038 105024 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.044 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '7a8539fb-8779-42f7-8fa8-222db61ea5ae'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], external_ids={}, name=7a8539fb-8779-42f7-8fa8-222db61ea5ae, nb_cfg_timestamp=1765224005225, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.045 105024 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f4da7d05130>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.046 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.046 105024 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.047 105024 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.047 105024 INFO oslo_service.service [-] Starting 1 workers
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.052 105024 DEBUG oslo_service.service [-] Started child 105131 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.056 105024 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpw4j9mjil/privsep.sock']
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.056 105131 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-162619'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.080 105131 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.081 105131 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.081 105131 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.085 105131 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.090 105131 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.096 105131 INFO eventlet.wsgi.server [-] (105131) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Dec 08 20:00:55 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.753 105024 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.753 105024 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpw4j9mjil/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.619 105136 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.624 105136 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.629 105136 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.630 105136 INFO oslo.privsep.daemon [-] privsep daemon running as pid 105136
Dec 08 20:00:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:55.756 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[3b2411af-636e-4b4d-88aa-5bacaa99cc32]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.282 105136 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.282 105136 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.282 105136 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.817 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[b75a9ab4-c54e-40e2-80d3-c68d209723c1]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.820 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, column=external_ids, values=({'neutron:ovn-metadata-id': '76af27b4-bc42-5759-96c0-0e6d58e715f8'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.831 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.839 105024 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.839 105024 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.840 105024 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.840 105024 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.840 105024 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.840 105024 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.840 105024 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.840 105024 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.840 105024 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.840 105024 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.841 105024 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.841 105024 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.841 105024 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.841 105024 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.845 105024 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.845 105024 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.845 105024 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.845 105024 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.845 105024 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.845 105024 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.846 105024 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.846 105024 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.846 105024 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.846 105024 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.846 105024 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.846 105024 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.846 105024 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.847 105024 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.847 105024 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.847 105024 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.847 105024 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.847 105024 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.847 105024 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.848 105024 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.848 105024 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.848 105024 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.848 105024 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.848 105024 DEBUG oslo_service.service [-] host                           = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.849 105024 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.850 105024 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.850 105024 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.850 105024 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.850 105024 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.850 105024 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.850 105024 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.850 105024 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.850 105024 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.850 105024 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.851 105024 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.851 105024 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.851 105024 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.851 105024 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.851 105024 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.851 105024 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.851 105024 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.852 105024 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.852 105024 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.852 105024 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.852 105024 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.852 105024 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.852 105024 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.852 105024 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.853 105024 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.853 105024 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.853 105024 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.853 105024 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.853 105024 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.853 105024 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.853 105024 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.853 105024 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.853 105024 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.854 105024 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.854 105024 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.854 105024 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.854 105024 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.854 105024 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.854 105024 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.854 105024 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.855 105024 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.855 105024 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.855 105024 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.855 105024 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.855 105024 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.855 105024 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.855 105024 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.855 105024 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.856 105024 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.856 105024 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.856 105024 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.856 105024 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.856 105024 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.856 105024 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.856 105024 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.856 105024 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.857 105024 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.857 105024 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.857 105024 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.857 105024 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.857 105024 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.857 105024 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.857 105024 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.858 105024 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.858 105024 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.858 105024 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.858 105024 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.858 105024 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.858 105024 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.858 105024 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.859 105024 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.859 105024 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.859 105024 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.859 105024 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.859 105024 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.859 105024 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.860 105024 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.860 105024 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.860 105024 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.860 105024 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.860 105024 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.860 105024 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.860 105024 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.861 105024 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.861 105024 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.861 105024 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.861 105024 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.861 105024 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.861 105024 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.861 105024 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.862 105024 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.862 105024 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.862 105024 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.862 105024 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.862 105024 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.862 105024 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.862 105024 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.862 105024 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.863 105024 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.863 105024 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.863 105024 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.863 105024 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.863 105024 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.863 105024 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.863 105024 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.863 105024 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.863 105024 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.864 105024 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.865 105024 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.865 105024 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.865 105024 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.865 105024 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.865 105024 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.865 105024 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.865 105024 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.865 105024 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.866 105024 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.866 105024 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.866 105024 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.866 105024 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.866 105024 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.866 105024 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.866 105024 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.866 105024 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.866 105024 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.867 105024 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.867 105024 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.867 105024 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.867 105024 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.867 105024 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.867 105024 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.867 105024 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.867 105024 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.867 105024 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.868 105024 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.868 105024 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.868 105024 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.868 105024 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.868 105024 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.868 105024 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.868 105024 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.868 105024 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.868 105024 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.869 105024 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.869 105024 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.869 105024 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.869 105024 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.869 105024 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.869 105024 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.869 105024 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.869 105024 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.870 105024 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.870 105024 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.870 105024 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.870 105024 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.870 105024 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.870 105024 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.870 105024 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.870 105024 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.871 105024 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.871 105024 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.871 105024 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.871 105024 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.871 105024 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.871 105024 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.871 105024 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.871 105024 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.872 105024 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.872 105024 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.872 105024 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.872 105024 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.872 105024 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.872 105024 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.872 105024 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.872 105024 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.873 105024 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.873 105024 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.873 105024 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.873 105024 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.873 105024 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.873 105024 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.873 105024 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.874 105024 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.874 105024 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.874 105024 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.874 105024 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.874 105024 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.874 105024 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.874 105024 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.874 105024 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.875 105024 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.875 105024 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.875 105024 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.875 105024 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.875 105024 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.875 105024 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.875 105024 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.876 105024 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.876 105024 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.876 105024 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.876 105024 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.876 105024 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.876 105024 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.876 105024 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.876 105024 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.877 105024 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.877 105024 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.877 105024 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.877 105024 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.877 105024 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.877 105024 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.877 105024 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.877 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.878 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.878 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.878 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.878 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.878 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.878 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.878 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.878 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.878 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.879 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.879 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.879 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.879 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.879 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.879 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.879 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.879 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.880 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.880 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.880 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.880 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.880 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.880 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.880 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.880 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.880 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.881 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.881 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.881 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.881 105024 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.881 105024 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.881 105024 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.881 105024 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.881 105024 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:00:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:00:56.881 105024 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 08 20:00:57 compute-0 podman[105141]: 2025-12-08 20:00:57.551167304 +0000 UTC m=+0.112967034 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 08 20:00:59 compute-0 sshd-session[105169]: Accepted publickey for zuul from 192.168.122.30 port 60506 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 20:00:59 compute-0 systemd-logind[793]: New session 22 of user zuul.
Dec 08 20:00:59 compute-0 systemd[1]: Started Session 22 of User zuul.
Dec 08 20:00:59 compute-0 sshd-session[105169]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 20:01:01 compute-0 python3.9[105322]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 20:01:01 compute-0 CROND[105328]: (root) CMD (run-parts /etc/cron.hourly)
Dec 08 20:01:01 compute-0 run-parts[105331]: (/etc/cron.hourly) starting 0anacron
Dec 08 20:01:01 compute-0 anacron[105339]: Anacron started on 2025-12-08
Dec 08 20:01:01 compute-0 anacron[105339]: Will run job `cron.daily' in 22 min.
Dec 08 20:01:01 compute-0 anacron[105339]: Will run job `cron.weekly' in 42 min.
Dec 08 20:01:01 compute-0 anacron[105339]: Will run job `cron.monthly' in 62 min.
Dec 08 20:01:01 compute-0 anacron[105339]: Jobs will be executed sequentially
Dec 08 20:01:01 compute-0 run-parts[105341]: (/etc/cron.hourly) finished 0anacron
Dec 08 20:01:01 compute-0 CROND[105327]: (root) CMDEND (run-parts /etc/cron.hourly)
Dec 08 20:01:02 compute-0 sudo[105491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efpyliajbbkfvcnhusfxbtadgaarcqkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224061.8038006-34-103597422418335/AnsiballZ_command.py'
Dec 08 20:01:02 compute-0 sudo[105491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:02 compute-0 python3.9[105493]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:01:02 compute-0 sudo[105491]: pam_unix(sudo:session): session closed for user root
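The ansible.legacy.command task logged just above only probes whether a container named nova_virtlogd still exists; an empty result means the legacy TripleO-managed container is already gone. A standalone sketch of the same probe (assuming podman is on PATH for the invoking user):

    import subprocess

    # Same filter and format as the logged command: print only the names of
    # containers (running or stopped) whose name matches ^nova_virtlogd$ exactly.
    out = subprocess.run(
        ["podman", "ps", "-a", "--filter", "name=^nova_virtlogd$",
         "--format", "{{.Names}}"],
        capture_output=True, text=True, check=True,
    )
    print("present" if out.stdout.strip() else "absent")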
Dec 08 20:01:03 compute-0 sudo[105656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eoithmwpveparnojmgnvuqpyiddkejhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224062.8021429-45-256939332212611/AnsiballZ_systemd_service.py'
Dec 08 20:01:03 compute-0 sudo[105656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:03 compute-0 python3.9[105658]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:01:03 compute-0 systemd[1]: Reloading.
Dec 08 20:01:03 compute-0 systemd-rc-local-generator[105684]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:01:03 compute-0 systemd-sysv-generator[105689]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:01:03 compute-0 sudo[105656]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:04 compute-0 python3.9[105842]: ansible-ansible.builtin.service_facts Invoked
Dec 08 20:01:04 compute-0 network[105859]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 08 20:01:04 compute-0 network[105860]: 'network-scripts' will be removed from distribution in near future.
Dec 08 20:01:04 compute-0 network[105861]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 08 20:01:09 compute-0 sudo[106120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iougalygbqkikvrzgxtfiixwnqhfiyqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224068.704663-64-1671157928920/AnsiballZ_systemd_service.py'
Dec 08 20:01:09 compute-0 sudo[106120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:09 compute-0 python3.9[106122]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:01:09 compute-0 sudo[106120]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:09 compute-0 sudo[106273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knfynlzbpieyzbygrykkpvuhhguruaif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224069.5036733-64-134515093599589/AnsiballZ_systemd_service.py'
Dec 08 20:01:09 compute-0 sudo[106273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:10 compute-0 python3.9[106275]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:01:10 compute-0 sudo[106273]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:10 compute-0 sudo[106426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rscgjeluxokgyezztyezmbjfowjswlqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224070.3073592-64-186287625140517/AnsiballZ_systemd_service.py'
Dec 08 20:01:10 compute-0 sudo[106426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:10 compute-0 python3.9[106428]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:01:10 compute-0 sudo[106426]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:11 compute-0 sudo[106579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zefwmhilvlxcqkywgceyeuupcdwhdshr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224071.0819485-64-127028651289518/AnsiballZ_systemd_service.py'
Dec 08 20:01:11 compute-0 sudo[106579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:11 compute-0 python3.9[106581]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:01:11 compute-0 sudo[106579]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:12 compute-0 sudo[106732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-byrwlrtaqyruusuunwvmxjpzznpjplbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224072.0092068-64-154961080211438/AnsiballZ_systemd_service.py'
Dec 08 20:01:12 compute-0 sudo[106732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:12 compute-0 python3.9[106734]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:01:12 compute-0 sudo[106732]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:13 compute-0 sudo[106885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utledpqdipfrhrcyaauvlskqillioibn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224072.7345877-64-155556725898486/AnsiballZ_systemd_service.py'
Dec 08 20:01:13 compute-0 sudo[106885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:13 compute-0 python3.9[106887]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:01:13 compute-0 sudo[106885]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:13 compute-0 sudo[107038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnpmoekyxmsdbwdxkiekeukzmfhdurde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224073.4579268-64-109187845487676/AnsiballZ_systemd_service.py'
Dec 08 20:01:13 compute-0 sudo[107038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:14 compute-0 python3.9[107040]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:01:14 compute-0 sudo[107038]: pam_unix(sudo:session): session closed for user root
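The block of systemd_service tasks between 20:01:09 and 20:01:14 disables and stops each legacy tripleo_nova_* unit in turn (enabled=False, state=stopped). A condensed sketch of the same net effect with plain systemctl calls; the Ansible module additionally records per-unit changed/failed state, which this does not:

    import subprocess

    TRIPLEO_UNITS = [
        "tripleo_nova_libvirt.target",
        "tripleo_nova_virtlogd_wrapper.service",
        "tripleo_nova_virtnodedevd.service",
        "tripleo_nova_virtproxyd.service",
        "tripleo_nova_virtqemud.service",
        "tripleo_nova_virtsecretd.service",
        "tripleo_nova_virtstoraged.service",
    ]

    for unit in TRIPLEO_UNITS:
        # enabled=False + state=stopped in the logged tasks ~= disable --now;
        # check=False because an already-absent unit is not an error here.
        subprocess.run(["systemctl", "disable", "--now", unit], check=False)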
Dec 08 20:01:15 compute-0 sudo[107191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixisjvtgikfvvrygggghwmtobizrjpia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224074.71287-116-100359883051111/AnsiballZ_file.py'
Dec 08 20:01:15 compute-0 sudo[107191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:15 compute-0 python3.9[107193]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:15 compute-0 sudo[107191]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:16 compute-0 sudo[107343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dphhkdvenvcwffkttfdhtjfqpzbqrper ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224075.5064366-116-212083572964908/AnsiballZ_file.py'
Dec 08 20:01:16 compute-0 sudo[107343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:16 compute-0 python3.9[107345]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:16 compute-0 sudo[107343]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:16 compute-0 sudo[107495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akxnotulmkojjzfdksaoxjtxtwnjrqrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224076.5095198-116-139571500630332/AnsiballZ_file.py'
Dec 08 20:01:16 compute-0 sudo[107495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:16 compute-0 python3.9[107497]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:17 compute-0 sudo[107495]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:17 compute-0 sudo[107647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tesviecinrrfvnelvzkrxsdknicqojsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224077.1403606-116-244444075939351/AnsiballZ_file.py'
Dec 08 20:01:17 compute-0 sudo[107647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:17 compute-0 python3.9[107649]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:17 compute-0 sudo[107647]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:18 compute-0 sudo[107799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbystodsthnevsxshhekzcanrxoonwhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224077.811105-116-203316293529811/AnsiballZ_file.py'
Dec 08 20:01:18 compute-0 sudo[107799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:18 compute-0 python3.9[107801]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:18 compute-0 sudo[107799]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:18 compute-0 sudo[107951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uehdjumtrsnrjlofjdjflguzulkuzeav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224078.492062-116-267658102281364/AnsiballZ_file.py'
Dec 08 20:01:18 compute-0 sudo[107951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:19 compute-0 python3.9[107953]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:19 compute-0 sudo[107951]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:19 compute-0 sudo[108103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orhttflxgqdeuaddyqbpxhowimbbjass ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224079.1675904-116-213450436909167/AnsiballZ_file.py'
Dec 08 20:01:19 compute-0 sudo[108103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:19 compute-0 python3.9[108105]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:19 compute-0 sudo[108103]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:20 compute-0 sudo[108255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awdkibaclddoqwscclqrezrhrlkhlmta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224079.8008683-166-71558894039629/AnsiballZ_file.py'
Dec 08 20:01:20 compute-0 sudo[108255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:20 compute-0 python3.9[108257]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:20 compute-0 sudo[108255]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:20 compute-0 sudo[108407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfdaboraasubpnikmuvvpitbmrteeduf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224080.5190403-166-133604331136582/AnsiballZ_file.py'
Dec 08 20:01:20 compute-0 sudo[108407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:20 compute-0 python3.9[108409]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:20 compute-0 sudo[108407]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:21 compute-0 sudo[108559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezvmckwvgoywlrwlnlqpsywcijwtepqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224081.1105928-166-145036158959079/AnsiballZ_file.py'
Dec 08 20:01:21 compute-0 sudo[108559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:21 compute-0 python3.9[108561]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:21 compute-0 sudo[108559]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:22 compute-0 sudo[108711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqfeugclmartdgptfefyuecwlspswyfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224081.7086852-166-254927134989617/AnsiballZ_file.py'
Dec 08 20:01:22 compute-0 sudo[108711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:22 compute-0 python3.9[108713]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:22 compute-0 sudo[108711]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:22 compute-0 sudo[108863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tchkgkdvcbwssurjcwgmmjgcuaptmrdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224082.3713853-166-239155872316020/AnsiballZ_file.py'
Dec 08 20:01:22 compute-0 sudo[108863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:22 compute-0 python3.9[108865]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:22 compute-0 sudo[108863]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:23 compute-0 sudo[109032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scyyccsukcvsemrvdxvtgrdmycdfobjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224082.97791-166-180808725875334/AnsiballZ_file.py'
Dec 08 20:01:23 compute-0 sudo[109032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:23 compute-0 podman[108991]: 2025-12-08 20:01:23.285116563 +0000 UTC m=+0.052629096 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true)
Dec 08 20:01:23 compute-0 sshd-session[108866]: Received disconnect from 159.223.8.81 port 57960:11: Bye Bye [preauth]
Dec 08 20:01:23 compute-0 sshd-session[108866]: Disconnected from authenticating user root 159.223.8.81 port 57960 [preauth]
Dec 08 20:01:23 compute-0 python3.9[109038]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:23 compute-0 sudo[109032]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:23 compute-0 sudo[109188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zecwitrlfrzvirrefxmwgvtwgmcswrev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224083.639921-166-178645724789222/AnsiballZ_file.py'
Dec 08 20:01:23 compute-0 sudo[109188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:24 compute-0 python3.9[109190]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:01:24 compute-0 sudo[109188]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:24 compute-0 sudo[109340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymxclvybhqublbcnshynaaxescoxtgrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224084.4128213-217-171494359953212/AnsiballZ_command.py'
Dec 08 20:01:24 compute-0 sudo[109340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:24 compute-0 python3.9[109342]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:01:25 compute-0 sudo[109340]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:25 compute-0 python3.9[109494]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 08 20:01:26 compute-0 sudo[109644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtyhtmokeheajxaihrqrvxjsouimhdxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224086.1399748-235-225266504374257/AnsiballZ_systemd_service.py'
Dec 08 20:01:26 compute-0 sudo[109644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:26 compute-0 python3.9[109646]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:01:26 compute-0 systemd[1]: Reloading.
Dec 08 20:01:26 compute-0 systemd-sysv-generator[109675]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:01:26 compute-0 systemd-rc-local-generator[109671]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:01:27 compute-0 sudo[109644]: pam_unix(sudo:session): session closed for user root
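With the units stopped, the file tasks between 20:01:15 and 20:01:24 delete the same unit files from both /usr/lib/systemd/system and /etc/systemd/system (state=absent), and the systemd_service task above reloads the unit database. A compact sketch of the combined effect; note the real tasks name each file explicitly, while this uses a glob over the shared prefix purely for brevity:

    import subprocess
    from pathlib import Path

    # Drop leftover TripleO nova unit files from both unit directories,
    # tolerating files that disappear underneath us, then reload systemd.
    for unit_dir in ("/usr/lib/systemd/system", "/etc/systemd/system"):
        for unit_file in Path(unit_dir).glob("tripleo_nova_*"):
            unit_file.unlink(missing_ok=True)

    # Equivalent of the ansible.builtin.systemd_service daemon_reload=True task.
    subprocess.run(["systemctl", "daemon-reload"], check=True)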
Dec 08 20:01:27 compute-0 sudo[109831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yszjmmokhhsllsyrylzhfeclmwjbunyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224087.2686777-243-17207338416851/AnsiballZ_command.py'
Dec 08 20:01:27 compute-0 sudo[109831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:27 compute-0 podman[109833]: 2025-12-08 20:01:27.685821921 +0000 UTC m=+0.079085122 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:01:27 compute-0 python3.9[109834]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:01:27 compute-0 sudo[109831]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:28 compute-0 sudo[110010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjzpomuscofaosnzhuxujxznmolghevo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224087.9797218-243-38268320611175/AnsiballZ_command.py'
Dec 08 20:01:28 compute-0 sudo[110010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:28 compute-0 python3.9[110012]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:01:28 compute-0 sudo[110010]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:28 compute-0 sudo[110163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czxtyrppmroiotazkwwoncevmpiikyti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224088.615819-243-173829318031835/AnsiballZ_command.py'
Dec 08 20:01:28 compute-0 sudo[110163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:29 compute-0 python3.9[110165]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:01:29 compute-0 sudo[110163]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:29 compute-0 sudo[110316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwzfgvpxrvppmpxptmvqvnzhatpbjezb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224089.207688-243-30760186160339/AnsiballZ_command.py'
Dec 08 20:01:29 compute-0 sudo[110316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:29 compute-0 python3.9[110318]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:01:29 compute-0 sudo[110316]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:30 compute-0 sudo[110469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsihizpfkpsfuphhvswyueerfsuluazj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224089.8182833-243-63849969145248/AnsiballZ_command.py'
Dec 08 20:01:30 compute-0 sudo[110469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:30 compute-0 python3.9[110471]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:01:31 compute-0 sudo[110469]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:31 compute-0 sudo[110622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlymfpdarztwssrlvljpdeezkcwhozae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224091.6138773-243-243778423645319/AnsiballZ_command.py'
Dec 08 20:01:31 compute-0 sudo[110622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:32 compute-0 python3.9[110624]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:01:32 compute-0 sudo[110622]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:32 compute-0 sudo[110775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgtewlfsdvxzbortlnpkqzsikgzlnloj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224092.2017164-243-90361697095800/AnsiballZ_command.py'
Dec 08 20:01:32 compute-0 sudo[110775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:32 compute-0 python3.9[110777]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:01:32 compute-0 sudo[110775]: pam_unix(sudo:session): session closed for user root
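The reset-failed commands from 20:01:27 onward clear any lingering failed-unit state for the same seven units, one systemctl call per unit, so the now-removed units no longer show up in systemctl --failed. The per-unit call, shown here for one of them (the log repeats it for each name in the list above):

    import subprocess

    # Non-zero exit (unit unknown or not in a failed state) is not an error here.
    subprocess.run(
        ["/usr/bin/systemctl", "reset-failed", "tripleo_nova_virtqemud.service"],
        check=False,
    )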
Dec 08 20:01:33 compute-0 sudo[110928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-felyprzqfcntjbpxxfsujjqcmfanqhpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224093.0598373-297-105707832788035/AnsiballZ_getent.py'
Dec 08 20:01:33 compute-0 sudo[110928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:33 compute-0 python3.9[110930]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 08 20:01:33 compute-0 sudo[110928]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:34 compute-0 sudo[111081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grinmitbzqmlelwdwqbkytehmphajxuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224093.9624693-305-107062999959844/AnsiballZ_group.py'
Dec 08 20:01:34 compute-0 sudo[111081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:34 compute-0 python3.9[111083]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 08 20:01:34 compute-0 groupadd[111084]: group added to /etc/group: name=libvirt, GID=42473
Dec 08 20:01:34 compute-0 groupadd[111084]: group added to /etc/gshadow: name=libvirt
Dec 08 20:01:34 compute-0 groupadd[111084]: new group: name=libvirt, GID=42473
Dec 08 20:01:34 compute-0 sudo[111081]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:35 compute-0 sudo[111239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjtrvrfckaavvlqqslcqqmhyxtwnyoro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224095.139917-313-93034910133520/AnsiballZ_user.py'
Dec 08 20:01:35 compute-0 sudo[111239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:35 compute-0 sshd[1006]: Timeout before authentication for connection from 222.172.32.246 to 38.102.83.66, pid = 91935
Dec 08 20:01:35 compute-0 python3.9[111241]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 08 20:01:36 compute-0 useradd[111243]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Dec 08 20:01:36 compute-0 sudo[111239]: pam_unix(sudo:session): session closed for user root
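The getent/group/user tasks at 20:01:33-20:01:36 make sure a libvirt group and a matching nologin libvirt account exist with the fixed ID 42473 rather than a distro-assigned one. Assuming neither name exists yet, the net effect corresponds to the following shadow-utils calls:

    import subprocess

    # Fixed GID/UID as in the log; nologin shell; home created as the log shows.
    subprocess.run(["groupadd", "--gid", "42473", "libvirt"], check=True)
    subprocess.run(
        ["useradd", "--uid", "42473", "--gid", "libvirt", "--create-home",
         "--comment", "libvirt user", "--shell", "/sbin/nologin", "libvirt"],
        check=True,
    )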
Dec 08 20:01:37 compute-0 sudo[111399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqnxbggmrkrckjutlpumykyfffttihlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224096.9799984-324-120255158625624/AnsiballZ_setup.py'
Dec 08 20:01:37 compute-0 sudo[111399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:37 compute-0 python3.9[111401]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 20:01:37 compute-0 sudo[111399]: pam_unix(sudo:session): session closed for user root
Dec 08 20:01:38 compute-0 sudo[111483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgfxtvvdcdnnmcidpmsovlylxumkgjwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224096.9799984-324-120255158625624/AnsiballZ_dnf.py'
Dec 08 20:01:38 compute-0 sudo[111483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:01:38 compute-0 python3.9[111485]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
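The ansible.legacy.dnf task above installs the virtualization stack for the node: libvirt (daemon, admin and client pieces), qemu-kvm and qemu-img, libguestfs, libseccomp, swtpm, edk2-ovmf, ceph-common and cyrus-sasl-scram. The SELinux policy reload messages and the dnsmasq/clevis/ceph account creations further down are, presumably, side effects of this transaction's packages and their scriptlets. An equivalent one-shot install, assuming the same repositories are enabled:

    import subprocess

    PACKAGES = [
        "libvirt", "libvirt-admin", "libvirt-client", "libvirt-daemon",
        "qemu-kvm", "qemu-img", "libguestfs", "libseccomp",
        "swtpm", "swtpm-tools", "edk2-ovmf", "ceph-common", "cyrus-sasl-scram",
    ]

    # state=present in the logged task ~= a plain install; -y skips the prompt.
    subprocess.run(["dnf", "-y", "install", *PACKAGES], check=True)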
Dec 08 20:01:53 compute-0 podman[111637]: 2025-12-08 20:01:53.482899719 +0000 UTC m=+0.055844379 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:01:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:01:54.973 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:01:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:01:54.975 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:01:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:01:54.975 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:01:58 compute-0 podman[111701]: 2025-12-08 20:01:58.513778418 +0000 UTC m=+0.086629492 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller)
Dec 08 20:02:00 compute-0 sshd-session[111729]: Invalid user svn from 172.190.42.55 port 45118
Dec 08 20:02:00 compute-0 sshd-session[111729]: Received disconnect from 172.190.42.55 port 45118:11: Bye Bye [preauth]
Dec 08 20:02:00 compute-0 sshd-session[111729]: Disconnected from invalid user svn 172.190.42.55 port 45118 [preauth]
Dec 08 20:02:11 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Dec 08 20:02:11 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 20:02:11 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 08 20:02:11 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 20:02:11 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 08 20:02:11 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 20:02:11 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 20:02:11 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 20:02:21 compute-0 kernel: SELinux:  Converting 2757 SID table entries...
Dec 08 20:02:21 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 20:02:21 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 08 20:02:21 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 20:02:21 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 08 20:02:21 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 20:02:21 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 20:02:21 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 20:02:24 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 08 20:02:24 compute-0 podman[111753]: 2025-12-08 20:02:24.499163073 +0000 UTC m=+0.062587292 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 08 20:02:28 compute-0 sshd-session[111751]: Received disconnect from 45.78.228.32 port 46148:11: Bye Bye [preauth]
Dec 08 20:02:28 compute-0 sshd-session[111751]: Disconnected from authenticating user root 45.78.228.32 port 46148 [preauth]
Dec 08 20:02:29 compute-0 podman[111772]: 2025-12-08 20:02:29.535922603 +0000 UTC m=+0.106822521 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 08 20:02:37 compute-0 sshd[1006]: drop connection #0 from [222.172.32.246]:2176 on [38.102.83.66]:22 penalty: exceeded LoginGraceTime
Dec 08 20:02:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:02:54.974 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:02:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:02:54.974 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:02:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:02:54.975 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:02:55 compute-0 podman[125929]: 2025-12-08 20:02:55.053800547 +0000 UTC m=+0.051195401 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:03:00 compute-0 podman[128605]: 2025-12-08 20:03:00.52026419 +0000 UTC m=+0.093103121 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 08 20:03:13 compute-0 kernel: SELinux:  Converting 2758 SID table entries...
Dec 08 20:03:13 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Dec 08 20:03:13 compute-0 kernel: SELinux:  policy capability open_perms=1
Dec 08 20:03:13 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Dec 08 20:03:13 compute-0 kernel: SELinux:  policy capability always_check_network=0
Dec 08 20:03:13 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 08 20:03:13 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 08 20:03:13 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 08 20:03:13 compute-0 sshd-session[128652]: Received disconnect from 172.190.42.55 port 42324:11: Bye Bye [preauth]
Dec 08 20:03:13 compute-0 sshd-session[128652]: Disconnected from authenticating user root 172.190.42.55 port 42324 [preauth]
Dec 08 20:03:14 compute-0 groupadd[128660]: group added to /etc/group: name=dnsmasq, GID=992
Dec 08 20:03:14 compute-0 groupadd[128660]: group added to /etc/gshadow: name=dnsmasq
Dec 08 20:03:14 compute-0 groupadd[128660]: new group: name=dnsmasq, GID=992
Dec 08 20:03:14 compute-0 useradd[128667]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Dec 08 20:03:14 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 20:03:14 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 08 20:03:14 compute-0 dbus-broker-launch[762]: Noticed file-system modification, trigger reload.
Dec 08 20:03:15 compute-0 groupadd[128680]: group added to /etc/group: name=clevis, GID=991
Dec 08 20:03:15 compute-0 groupadd[128680]: group added to /etc/gshadow: name=clevis
Dec 08 20:03:15 compute-0 groupadd[128680]: new group: name=clevis, GID=991
Dec 08 20:03:15 compute-0 useradd[128687]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Dec 08 20:03:15 compute-0 usermod[128697]: add 'clevis' to group 'tss'
Dec 08 20:03:15 compute-0 usermod[128697]: add 'clevis' to shadow group 'tss'
Dec 08 20:03:17 compute-0 polkitd[43925]: Reloading rules
Dec 08 20:03:17 compute-0 polkitd[43925]: Collecting garbage unconditionally...
Dec 08 20:03:17 compute-0 polkitd[43925]: Loading rules from directory /etc/polkit-1/rules.d
Dec 08 20:03:17 compute-0 polkitd[43925]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 08 20:03:17 compute-0 polkitd[43925]: Finished loading, compiling and executing 3 rules
Dec 08 20:03:17 compute-0 polkitd[43925]: Reloading rules
Dec 08 20:03:17 compute-0 polkitd[43925]: Collecting garbage unconditionally...
Dec 08 20:03:17 compute-0 polkitd[43925]: Loading rules from directory /etc/polkit-1/rules.d
Dec 08 20:03:17 compute-0 polkitd[43925]: Loading rules from directory /usr/share/polkit-1/rules.d
Dec 08 20:03:17 compute-0 polkitd[43925]: Finished loading, compiling and executing 3 rules
Dec 08 20:03:18 compute-0 groupadd[128884]: group added to /etc/group: name=ceph, GID=167
Dec 08 20:03:18 compute-0 groupadd[128884]: group added to /etc/gshadow: name=ceph
Dec 08 20:03:18 compute-0 groupadd[128884]: new group: name=ceph, GID=167
Dec 08 20:03:18 compute-0 useradd[128890]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Dec 08 20:03:22 compute-0 sshd[1006]: Received signal 15; terminating.
Dec 08 20:03:22 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Dec 08 20:03:22 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Dec 08 20:03:22 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Dec 08 20:03:22 compute-0 systemd[1]: sshd.service: Consumed 4.262s CPU time, read 32.0K from disk, written 148.0K to disk.
Dec 08 20:03:22 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Dec 08 20:03:22 compute-0 systemd[1]: Stopping sshd-keygen.target...
Dec 08 20:03:22 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 08 20:03:22 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 08 20:03:22 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 08 20:03:22 compute-0 systemd[1]: Reached target sshd-keygen.target.
Dec 08 20:03:22 compute-0 systemd[1]: Starting OpenSSH server daemon...
Dec 08 20:03:22 compute-0 sshd[129409]: Server listening on 0.0.0.0 port 22.
Dec 08 20:03:22 compute-0 sshd[129409]: Server listening on :: port 22.
Dec 08 20:03:22 compute-0 systemd[1]: Started OpenSSH server daemon.
Dec 08 20:03:23 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 08 20:03:23 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 08 20:03:24 compute-0 systemd[1]: Reloading.
Dec 08 20:03:24 compute-0 systemd-rc-local-generator[129669]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:24 compute-0 systemd-sysv-generator[129672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:24 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 08 20:03:25 compute-0 podman[131112]: 2025-12-08 20:03:25.486277469 +0000 UTC m=+0.054847101 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 08 20:03:27 compute-0 sudo[111483]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:28 compute-0 sudo[134468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fomuvthsjexppstyaxmxjqbfldyuxmox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224207.4250832-336-58644733640598/AnsiballZ_systemd.py'
Dec 08 20:03:28 compute-0 sudo[134468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:28 compute-0 python3.9[134490]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 08 20:03:28 compute-0 systemd[1]: Reloading.
Dec 08 20:03:28 compute-0 systemd-sysv-generator[134936]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:28 compute-0 systemd-rc-local-generator[134932]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:28 compute-0 sudo[134468]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:29 compute-0 sudo[135692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejynlktmjktmjuuozyncnldtokkusydn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224208.808381-336-274386542652296/AnsiballZ_systemd.py'
Dec 08 20:03:29 compute-0 sudo[135692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:29 compute-0 python3.9[135715]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 08 20:03:29 compute-0 systemd[1]: Reloading.
Dec 08 20:03:29 compute-0 systemd-rc-local-generator[136160]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:29 compute-0 systemd-sysv-generator[136163]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:29 compute-0 sudo[135692]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:30 compute-0 sudo[136960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbdhgkebahpeetcolakelttcncilbhhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224209.8430634-336-66117511332187/AnsiballZ_systemd.py'
Dec 08 20:03:30 compute-0 sudo[136960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:30 compute-0 python3.9[136982]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 08 20:03:30 compute-0 systemd[1]: Reloading.
Dec 08 20:03:30 compute-0 systemd-sysv-generator[137500]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:30 compute-0 systemd-rc-local-generator[137497]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:30 compute-0 sudo[136960]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:31 compute-0 podman[137696]: 2025-12-08 20:03:31.028409646 +0000 UTC m=+0.116587067 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 08 20:03:31 compute-0 sudo[138286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rstmcrsyvqyaxogskwxwkafqraagyhjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224211.0525496-336-196855259725477/AnsiballZ_systemd.py'
Dec 08 20:03:31 compute-0 sudo[138286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:31 compute-0 python3.9[138312]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 08 20:03:31 compute-0 systemd[1]: Reloading.
Dec 08 20:03:31 compute-0 systemd-rc-local-generator[138756]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:31 compute-0 systemd-sysv-generator[138759]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:31 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 08 20:03:31 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 08 20:03:31 compute-0 systemd[1]: man-db-cache-update.service: Consumed 9.723s CPU time.
Dec 08 20:03:31 compute-0 systemd[1]: run-rd8633b8222a34e4caacfa8a21b5cca01.service: Deactivated successfully.
Dec 08 20:03:31 compute-0 sudo[138286]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:32 compute-0 sudo[138995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpouflzoaxkengiadeofzqxayekdzelm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224212.047807-365-205259798846807/AnsiballZ_systemd.py'
Dec 08 20:03:32 compute-0 sudo[138995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:32 compute-0 python3.9[138997]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:32 compute-0 systemd[1]: Reloading.
Dec 08 20:03:32 compute-0 systemd-rc-local-generator[139029]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:32 compute-0 systemd-sysv-generator[139033]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:32 compute-0 sudo[138995]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:33 compute-0 sudo[139185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzrerpygcbvbkcbtxjbqlonzzxcdhnfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224213.1029115-365-88409641020518/AnsiballZ_systemd.py'
Dec 08 20:03:33 compute-0 sudo[139185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:33 compute-0 python3.9[139187]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:33 compute-0 systemd[1]: Reloading.
Dec 08 20:03:33 compute-0 systemd-sysv-generator[139221]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:33 compute-0 systemd-rc-local-generator[139217]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:33 compute-0 sudo[139185]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:34 compute-0 sudo[139375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrqhtjkzvhxmyelcimksuxemhpyzrdbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224214.110805-365-102481771512119/AnsiballZ_systemd.py'
Dec 08 20:03:34 compute-0 sudo[139375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:34 compute-0 python3.9[139377]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:34 compute-0 systemd[1]: Reloading.
Dec 08 20:03:34 compute-0 systemd-rc-local-generator[139408]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:34 compute-0 systemd-sysv-generator[139411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:35 compute-0 sudo[139375]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:35 compute-0 sudo[139565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxataeiscmobcehneiyrotfwkefyfevz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224215.1803293-365-62865105349618/AnsiballZ_systemd.py'
Dec 08 20:03:35 compute-0 sudo[139565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:35 compute-0 python3.9[139567]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:35 compute-0 sudo[139565]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:36 compute-0 sudo[139720]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucwxxqrqspaybxnbiotlgnlqlodlfxmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224215.936327-365-175522729653735/AnsiballZ_systemd.py'
Dec 08 20:03:36 compute-0 sudo[139720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:36 compute-0 python3.9[139722]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:36 compute-0 systemd[1]: Reloading.
Dec 08 20:03:36 compute-0 systemd-rc-local-generator[139754]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:36 compute-0 systemd-sysv-generator[139758]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:36 compute-0 sudo[139720]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:37 compute-0 sudo[139910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ziqhhrqkduuyfspyabtmuizhndolpxxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224217.032632-401-149874462423866/AnsiballZ_systemd.py'
Dec 08 20:03:37 compute-0 sudo[139910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:37 compute-0 python3.9[139912]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 08 20:03:37 compute-0 systemd[1]: Reloading.
Dec 08 20:03:37 compute-0 systemd-sysv-generator[139945]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:03:37 compute-0 systemd-rc-local-generator[139939]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:03:38 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 08 20:03:38 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 08 20:03:38 compute-0 sudo[139910]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:38 compute-0 sudo[140102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvuyfjxcjnkhvqonhcjofpyqxkutduss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224218.206686-409-96186429824960/AnsiballZ_systemd.py'
Dec 08 20:03:38 compute-0 sudo[140102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:38 compute-0 python3.9[140104]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:38 compute-0 sudo[140102]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:39 compute-0 sudo[140257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpwutdkkwurxcvcemgrlbblmcbvefymq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224219.0348716-409-195784244778444/AnsiballZ_systemd.py'
Dec 08 20:03:39 compute-0 sudo[140257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:39 compute-0 python3.9[140259]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:39 compute-0 sudo[140257]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:40 compute-0 sudo[140412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrwedcpkjzedavyiaaxbjatldpbxpsaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224219.9156916-409-102226584507932/AnsiballZ_systemd.py'
Dec 08 20:03:40 compute-0 sudo[140412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:40 compute-0 python3.9[140414]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:40 compute-0 sudo[140412]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:40 compute-0 sudo[140567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mysyphpyibxdegkdsaivtcsbpqlcghoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224220.7074966-409-253967477046105/AnsiballZ_systemd.py'
Dec 08 20:03:40 compute-0 sudo[140567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:41 compute-0 python3.9[140569]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:41 compute-0 sudo[140567]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:41 compute-0 sudo[140722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjnpfqhjzfquyurkwiphwkppwqnikidc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224221.5168016-409-125646201563480/AnsiballZ_systemd.py'
Dec 08 20:03:41 compute-0 sudo[140722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:42 compute-0 python3.9[140724]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:42 compute-0 sudo[140722]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:42 compute-0 sudo[140877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otmylvgjekthbqdznipzmxmhuslwdxvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224222.361535-409-47370003675841/AnsiballZ_systemd.py'
Dec 08 20:03:42 compute-0 sudo[140877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:42 compute-0 python3.9[140879]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:43 compute-0 sudo[140877]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:43 compute-0 sudo[141032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzqxuonijuxqtnkhrrqzhivkrwvemwya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224223.1750298-409-179659894787693/AnsiballZ_systemd.py'
Dec 08 20:03:43 compute-0 sudo[141032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:43 compute-0 python3.9[141034]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:43 compute-0 sudo[141032]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:44 compute-0 sudo[141187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axllivmydiftcomkzaxaigpvpoilzeis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224223.954713-409-207061541232612/AnsiballZ_systemd.py'
Dec 08 20:03:44 compute-0 sudo[141187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:44 compute-0 python3.9[141189]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:44 compute-0 sudo[141187]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:45 compute-0 sudo[141342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzxxyshiydtunssjpzrlvgcqzgweluds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224224.7357924-409-160100819016289/AnsiballZ_systemd.py'
Dec 08 20:03:45 compute-0 sudo[141342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:45 compute-0 python3.9[141344]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:45 compute-0 sudo[141342]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:45 compute-0 sudo[141497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzdfsirapztngbuvgrpmvmqolycygeih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224225.5801322-409-210718829378402/AnsiballZ_systemd.py'
Dec 08 20:03:45 compute-0 sudo[141497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:46 compute-0 python3.9[141499]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:46 compute-0 sudo[141497]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:46 compute-0 sudo[141652]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hanopfaydsqpevkcidvvxiaebkfqhnav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224226.3891966-409-34022409859262/AnsiballZ_systemd.py'
Dec 08 20:03:46 compute-0 sudo[141652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:46 compute-0 python3.9[141654]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:47 compute-0 sudo[141652]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:47 compute-0 sudo[141807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkekoygjvxsksiorigzneegqhbxpgzip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224227.149612-409-278742136097695/AnsiballZ_systemd.py'
Dec 08 20:03:47 compute-0 sudo[141807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:47 compute-0 python3.9[141809]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:47 compute-0 sudo[141807]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:48 compute-0 sudo[141962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkwnjptnyeusqjtgysyujevtnnqbfjlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224227.9001207-409-14620953090274/AnsiballZ_systemd.py'
Dec 08 20:03:48 compute-0 sudo[141962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:48 compute-0 python3.9[141964]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:48 compute-0 sudo[141962]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:49 compute-0 sudo[142117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqrowllmdouqekekrvdsbkxsilzbcmdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224228.7305784-409-100843053443030/AnsiballZ_systemd.py'
Dec 08 20:03:49 compute-0 sudo[142117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:49 compute-0 python3.9[142119]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 08 20:03:49 compute-0 sudo[142117]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:49 compute-0 sudo[142272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-reffbisyxrpqumipfasbjtfhtjmxnfgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224229.723118-511-49085869498884/AnsiballZ_file.py'
Dec 08 20:03:49 compute-0 sudo[142272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:50 compute-0 python3.9[142274]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:03:50 compute-0 sudo[142272]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:50 compute-0 sudo[142424]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pehqauinsjqonutemeethdetofisnkmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224230.3253672-511-12244806724550/AnsiballZ_file.py'
Dec 08 20:03:50 compute-0 sudo[142424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:50 compute-0 python3.9[142426]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:03:50 compute-0 sudo[142424]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:51 compute-0 sudo[142576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xanccrhufmmchxmqjaqjkzcohdyohkym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224231.0230439-511-120575207404988/AnsiballZ_file.py'
Dec 08 20:03:51 compute-0 sudo[142576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:51 compute-0 python3.9[142578]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:03:51 compute-0 sudo[142576]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:51 compute-0 sudo[142728]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbpjjcrmswemkdcmwlbwltgpurshxdcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224231.6355608-511-142536765009436/AnsiballZ_file.py'
Dec 08 20:03:51 compute-0 sudo[142728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:52 compute-0 python3.9[142730]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:03:52 compute-0 sudo[142728]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:52 compute-0 sudo[142880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plhlqtkwcvraeomyhjzlxlaxghatyrmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224232.297451-511-132110043699388/AnsiballZ_file.py'
Dec 08 20:03:52 compute-0 sudo[142880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:52 compute-0 python3.9[142882]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:03:52 compute-0 sudo[142880]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:53 compute-0 sudo[143032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grgxyxzspevknkuapvkgvqedowqtgxxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224232.9589639-511-169054355795677/AnsiballZ_file.py'
Dec 08 20:03:53 compute-0 sudo[143032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:53 compute-0 python3.9[143034]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:03:53 compute-0 sudo[143032]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:54 compute-0 sudo[143184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bibejyvgjjdtytrziwpmbbzohthqevvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224233.6665568-554-164640838361605/AnsiballZ_stat.py'
Dec 08 20:03:54 compute-0 sudo[143184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:54 compute-0 python3.9[143186]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:03:54 compute-0 sudo[143184]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:03:54.975 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:03:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:03:54.977 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:03:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:03:54.977 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:03:55 compute-0 sudo[143309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahcixxyrtjuzlkpcgigtuwfofzkzmekw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224233.6665568-554-164640838361605/AnsiballZ_copy.py'
Dec 08 20:03:55 compute-0 sudo[143309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:55 compute-0 python3.9[143311]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765224233.6665568-554-164640838361605/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:03:55 compute-0 sudo[143309]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:55 compute-0 sudo[143477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmtfanpumxvughaiadxjxtuerynvdkxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224235.403571-554-8049981836341/AnsiballZ_stat.py'
Dec 08 20:03:55 compute-0 sudo[143477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:55 compute-0 podman[143435]: 2025-12-08 20:03:55.765877416 +0000 UTC m=+0.083034712 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:03:55 compute-0 python3.9[143481]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:03:56 compute-0 sudo[143477]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:56 compute-0 sudo[143606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elzxgiyyjmtrtnvcmzmomojpwsppdbyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224235.403571-554-8049981836341/AnsiballZ_copy.py'
Dec 08 20:03:56 compute-0 sudo[143606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:56 compute-0 python3.9[143608]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765224235.403571-554-8049981836341/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:03:56 compute-0 sudo[143606]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:57 compute-0 sudo[143758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igwburbxqrkfniectwokxtytiuhlzxlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224236.729655-554-47435935812271/AnsiballZ_stat.py'
Dec 08 20:03:57 compute-0 sudo[143758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:57 compute-0 python3.9[143760]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:03:57 compute-0 sudo[143758]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:57 compute-0 sudo[143883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmaiokudkfovkgbzhpchbkpprnikbvid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224236.729655-554-47435935812271/AnsiballZ_copy.py'
Dec 08 20:03:57 compute-0 sudo[143883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:57 compute-0 python3.9[143885]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765224236.729655-554-47435935812271/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:03:57 compute-0 sudo[143883]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:58 compute-0 sudo[144035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avpuhovwmqeboezsenphaecfkehludqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224238.008787-554-36976096350412/AnsiballZ_stat.py'
Dec 08 20:03:58 compute-0 sudo[144035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:58 compute-0 python3.9[144037]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:03:58 compute-0 sudo[144035]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:58 compute-0 sudo[144160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzpzvrhxwdggshmizmpfufieifenbesv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224238.008787-554-36976096350412/AnsiballZ_copy.py'
Dec 08 20:03:58 compute-0 sudo[144160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:59 compute-0 python3.9[144162]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765224238.008787-554-36976096350412/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:03:59 compute-0 sudo[144160]: pam_unix(sudo:session): session closed for user root
Dec 08 20:03:59 compute-0 sudo[144312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvzfpavvmjurszbxbdqpvskuohojzjsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224239.2767854-554-106170577498031/AnsiballZ_stat.py'
Dec 08 20:03:59 compute-0 sudo[144312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:03:59 compute-0 python3.9[144314]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:03:59 compute-0 sudo[144312]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:00 compute-0 sudo[144437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksdwajggaxlqbylsszndulwkltgwltup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224239.2767854-554-106170577498031/AnsiballZ_copy.py'
Dec 08 20:04:00 compute-0 sudo[144437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:00 compute-0 python3.9[144439]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765224239.2767854-554-106170577498031/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:00 compute-0 sudo[144437]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:00 compute-0 sudo[144589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftmiegeadrtduvdbtlohshupdkjncmue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224240.4428144-554-85303543530324/AnsiballZ_stat.py'
Dec 08 20:04:00 compute-0 sudo[144589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:00 compute-0 python3.9[144591]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:00 compute-0 sudo[144589]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:01 compute-0 sudo[144727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnoahuhzxrrjolokkzgundmezanowjdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224240.4428144-554-85303543530324/AnsiballZ_copy.py'
Dec 08 20:04:01 compute-0 sudo[144727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:01 compute-0 podman[144688]: 2025-12-08 20:04:01.371756898 +0000 UTC m=+0.103612193 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:04:01 compute-0 python3.9[144735]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765224240.4428144-554-85303543530324/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:01 compute-0 sudo[144727]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:01 compute-0 sudo[144892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqegfprhudvagolpsoesledbbelgdesb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224241.7046018-554-178845546240214/AnsiballZ_stat.py'
Dec 08 20:04:02 compute-0 sudo[144892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:02 compute-0 python3.9[144894]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:02 compute-0 sudo[144892]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:02 compute-0 sudo[145015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htkltelqfoqapqbqvrervhrbybvsxigu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224241.7046018-554-178845546240214/AnsiballZ_copy.py'
Dec 08 20:04:02 compute-0 sudo[145015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:02 compute-0 python3.9[145017]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765224241.7046018-554-178845546240214/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:02 compute-0 sudo[145015]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:03 compute-0 sudo[145167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpmuybutjkanatvjcmcvujepjzvixwia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224242.8531034-554-38449516067882/AnsiballZ_stat.py'
Dec 08 20:04:03 compute-0 sudo[145167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:03 compute-0 python3.9[145169]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:03 compute-0 sudo[145167]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:03 compute-0 sudo[145292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltdttqcdshtofftvkxlkygvnzxjsfdlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224242.8531034-554-38449516067882/AnsiballZ_copy.py'
Dec 08 20:04:03 compute-0 sudo[145292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:03 compute-0 python3.9[145294]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765224242.8531034-554-38449516067882/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:03 compute-0 sudo[145292]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:04 compute-0 sudo[145444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztqvcufmpidpqnksqxhgjrryqkeypkil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224244.060051-667-200923541349308/AnsiballZ_command.py'
Dec 08 20:04:04 compute-0 sudo[145444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:04 compute-0 python3.9[145446]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 08 20:04:04 compute-0 sudo[145444]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:05 compute-0 sudo[145597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxjzgffkzfxazctyflkeoatrrmegzsdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224244.777426-676-64963737530547/AnsiballZ_file.py'
Dec 08 20:04:05 compute-0 sudo[145597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:05 compute-0 python3.9[145599]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:05 compute-0 sudo[145597]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:05 compute-0 sudo[145749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggwqoqjzjagfjhesontyjvbeovwfxvvj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224245.4222944-676-121387991171111/AnsiballZ_file.py'
Dec 08 20:04:05 compute-0 sudo[145749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:05 compute-0 python3.9[145751]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:05 compute-0 sudo[145749]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:06 compute-0 sudo[145901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayyoosunfhekmxqpmuyrpxinmlfcnfhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224246.0981715-676-148914433890728/AnsiballZ_file.py'
Dec 08 20:04:06 compute-0 sudo[145901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:06 compute-0 python3.9[145903]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:06 compute-0 sudo[145901]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:07 compute-0 sudo[146053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-siftonfjyquvzddosmvkuwcvegwcbhdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224246.7826948-676-173992696309947/AnsiballZ_file.py'
Dec 08 20:04:07 compute-0 sudo[146053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:07 compute-0 python3.9[146055]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:07 compute-0 sudo[146053]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:07 compute-0 sudo[146205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-heqztyottqljvaeqgoihtrjzcbafhgoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224247.403696-676-75223961724774/AnsiballZ_file.py'
Dec 08 20:04:07 compute-0 sudo[146205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:07 compute-0 python3.9[146207]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:07 compute-0 sudo[146205]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:08 compute-0 sudo[146357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcuqaqwkeattfbrdwcovjvhqtvxiqkfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224248.064873-676-150237693909914/AnsiballZ_file.py'
Dec 08 20:04:08 compute-0 sudo[146357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:08 compute-0 python3.9[146359]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:08 compute-0 sudo[146357]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:08 compute-0 sudo[146509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjeboihwlhnwigenmjxbccdptoaijjaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224248.7184002-676-221173213954782/AnsiballZ_file.py'
Dec 08 20:04:08 compute-0 sudo[146509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:09 compute-0 python3.9[146511]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:09 compute-0 sudo[146509]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:09 compute-0 sudo[146661]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgmvnsncsophelvmyqcswbcfavwaelhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224249.3691196-676-269839623414529/AnsiballZ_file.py'
Dec 08 20:04:09 compute-0 sudo[146661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:09 compute-0 python3.9[146663]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:09 compute-0 sudo[146661]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:10 compute-0 sudo[146813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmltssvlreuaxbueazcrlbmyxejdsiuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224250.046344-676-110296273529690/AnsiballZ_file.py'
Dec 08 20:04:10 compute-0 sudo[146813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:10 compute-0 python3.9[146815]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:10 compute-0 sudo[146813]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:11 compute-0 sudo[146965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqyylussfgqweygrvtkorkxkvruxtwrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224250.7060459-676-280233833912318/AnsiballZ_file.py'
Dec 08 20:04:11 compute-0 sudo[146965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:11 compute-0 python3.9[146967]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:11 compute-0 sudo[146965]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:11 compute-0 sudo[147117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atqgygselgdeodxvltkzjrjhhsdodakt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224251.3736098-676-228761104575246/AnsiballZ_file.py'
Dec 08 20:04:11 compute-0 sudo[147117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:11 compute-0 python3.9[147119]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:11 compute-0 sudo[147117]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:12 compute-0 sudo[147269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzqoyvataymvvpluqkhobqgoidwsxgdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224252.0398333-676-212995023320090/AnsiballZ_file.py'
Dec 08 20:04:12 compute-0 sudo[147269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:12 compute-0 python3.9[147271]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:12 compute-0 sudo[147269]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:12 compute-0 sudo[147421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlnuopaeegkaujavtohiuvboetoqiaym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224252.6612802-676-61871897105919/AnsiballZ_file.py'
Dec 08 20:04:12 compute-0 sudo[147421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:13 compute-0 python3.9[147423]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:13 compute-0 sudo[147421]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:13 compute-0 sudo[147573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncwzmeonoceklvicmvtjfogaufrywunx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224253.366443-676-133588722402310/AnsiballZ_file.py'
Dec 08 20:04:13 compute-0 sudo[147573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:13 compute-0 python3.9[147575]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:13 compute-0 sudo[147573]: pam_unix(sudo:session): session closed for user root
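The block of ansible.builtin.file tasks above creates a systemd drop-in directory for every modular libvirt socket unit. A rough shell equivalent, with the unit list taken directly from the logged paths:

# Create root-owned 0755 drop-in directories for each libvirt socket unit.
for unit in virtlogd virtlogd-admin \
            virtnodedevd virtnodedevd-ro virtnodedevd-admin \
            virtproxyd virtproxyd-ro virtproxyd-admin \
            virtqemud virtqemud-ro virtqemud-admin \
            virtsecretd virtsecretd-ro virtsecretd-admin; do
    install -d -m 0755 -o root -g root "/etc/systemd/system/${unit}.socket.d"
done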
Dec 08 20:04:14 compute-0 sudo[147725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhfmfjqjagjnztacuxapoyilzsauixld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224254.0846782-775-93315784271234/AnsiballZ_stat.py'
Dec 08 20:04:14 compute-0 sudo[147725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:14 compute-0 python3.9[147727]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:14 compute-0 sudo[147725]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:14 compute-0 sudo[147848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thvcejmkjhfycmwlzwucgzxmjaegqugv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224254.0846782-775-93315784271234/AnsiballZ_copy.py'
Dec 08 20:04:14 compute-0 sudo[147848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:15 compute-0 python3.9[147850]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224254.0846782-775-93315784271234/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:15 compute-0 sudo[147848]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:15 compute-0 sudo[148000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgmmvqhnbidadfplvftaddukettbofwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224255.347719-775-137353093274834/AnsiballZ_stat.py'
Dec 08 20:04:15 compute-0 sudo[148000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:15 compute-0 python3.9[148002]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:15 compute-0 sudo[148000]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:16 compute-0 sudo[148123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mactbpqctuslauvyjxabksvnypwspzim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224255.347719-775-137353093274834/AnsiballZ_copy.py'
Dec 08 20:04:16 compute-0 sudo[148123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:16 compute-0 python3.9[148125]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224255.347719-775-137353093274834/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:16 compute-0 sudo[148123]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:16 compute-0 sudo[148275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icuzhofmphalljxljgmdunphuyjwiinb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224256.4768178-775-153422526613843/AnsiballZ_stat.py'
Dec 08 20:04:16 compute-0 sudo[148275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:16 compute-0 python3.9[148277]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:16 compute-0 sudo[148275]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:17 compute-0 sudo[148398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqbapeibptzxwjpguxtbzqryklktngzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224256.4768178-775-153422526613843/AnsiballZ_copy.py'
Dec 08 20:04:17 compute-0 sudo[148398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:17 compute-0 python3.9[148400]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224256.4768178-775-153422526613843/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:17 compute-0 sudo[148398]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:17 compute-0 sudo[148550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etonbvnuodnelnghyltjiluikmekktpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224257.6713252-775-188613972132678/AnsiballZ_stat.py'
Dec 08 20:04:17 compute-0 sudo[148550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:18 compute-0 python3.9[148552]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:18 compute-0 sudo[148550]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:18 compute-0 sudo[148673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdimronajvzgxdorwzvrjpzaqdqrvqtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224257.6713252-775-188613972132678/AnsiballZ_copy.py'
Dec 08 20:04:18 compute-0 sudo[148673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:18 compute-0 python3.9[148675]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224257.6713252-775-188613972132678/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:18 compute-0 sudo[148673]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:19 compute-0 sudo[148825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmgsvzhihfrmgfhkqqcafslqllivmbmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224258.8210304-775-235731152290796/AnsiballZ_stat.py'
Dec 08 20:04:19 compute-0 sudo[148825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:19 compute-0 python3.9[148827]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:19 compute-0 sudo[148825]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:19 compute-0 sudo[148948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfkrqixgpsbgrsqcjdvzzgbekvzpefhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224258.8210304-775-235731152290796/AnsiballZ_copy.py'
Dec 08 20:04:19 compute-0 sudo[148948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:19 compute-0 python3.9[148950]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224258.8210304-775-235731152290796/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:19 compute-0 sudo[148948]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:20 compute-0 sudo[149100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhpornrragzalswtcxtzajlhvosrdtay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224259.9142525-775-216153619356200/AnsiballZ_stat.py'
Dec 08 20:04:20 compute-0 sudo[149100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:20 compute-0 python3.9[149102]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:20 compute-0 sudo[149100]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:20 compute-0 sudo[149223]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlnaklqwzsduxqedsrrzfxyirpqaxqgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224259.9142525-775-216153619356200/AnsiballZ_copy.py'
Dec 08 20:04:20 compute-0 sudo[149223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:21 compute-0 python3.9[149225]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224259.9142525-775-216153619356200/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:21 compute-0 sudo[149223]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:21 compute-0 sudo[149375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-papotgraqwhpirswpqtrwrraommfqnsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224261.2163599-775-235187969338389/AnsiballZ_stat.py'
Dec 08 20:04:21 compute-0 sudo[149375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:21 compute-0 python3.9[149377]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:21 compute-0 sudo[149375]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:22 compute-0 sudo[149498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcogqwfkdloczxrvartvqchjubdmtjsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224261.2163599-775-235187969338389/AnsiballZ_copy.py'
Dec 08 20:04:22 compute-0 sudo[149498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:22 compute-0 python3.9[149500]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224261.2163599-775-235187969338389/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:22 compute-0 sudo[149498]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:22 compute-0 sudo[149650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nncaizkbasacbtjqsnmhcbektlvppjdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224262.5970764-775-257299773953955/AnsiballZ_stat.py'
Dec 08 20:04:22 compute-0 sudo[149650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:23 compute-0 python3.9[149652]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:23 compute-0 sudo[149650]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:23 compute-0 sudo[149773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbtirltsmadedcvkeheamqljafgzfdes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224262.5970764-775-257299773953955/AnsiballZ_copy.py'
Dec 08 20:04:23 compute-0 sudo[149773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:23 compute-0 python3.9[149775]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224262.5970764-775-257299773953955/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:23 compute-0 sudo[149773]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:24 compute-0 sudo[149925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvigulgxpozsdhzspvuxpmrysvvqeztq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224263.9901404-775-84420100102919/AnsiballZ_stat.py'
Dec 08 20:04:24 compute-0 sudo[149925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:24 compute-0 python3.9[149927]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:24 compute-0 sudo[149925]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:24 compute-0 sudo[150048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndpznspwswhhqfmiguupmtxyvtlcugmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224263.9901404-775-84420100102919/AnsiballZ_copy.py'
Dec 08 20:04:24 compute-0 sudo[150048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:25 compute-0 python3.9[150050]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224263.9901404-775-84420100102919/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:25 compute-0 sudo[150048]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:25 compute-0 sudo[150200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujepctgtymswfpjqrumbuvzridecqcim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224265.267402-775-46632840031793/AnsiballZ_stat.py'
Dec 08 20:04:25 compute-0 sudo[150200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:25 compute-0 python3.9[150202]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:25 compute-0 sudo[150200]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:26 compute-0 podman[150297]: 2025-12-08 20:04:26.157721667 +0000 UTC m=+0.049450854 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:04:26 compute-0 sudo[150342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snmyptpfqspxsmwrfhymybvjtxmyijtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224265.267402-775-46632840031793/AnsiballZ_copy.py'
Dec 08 20:04:26 compute-0 sudo[150342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:26 compute-0 python3.9[150344]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224265.267402-775-46632840031793/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:26 compute-0 sudo[150342]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:26 compute-0 sudo[150494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdfmozyoylxianoejccffnotztgvrtuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224266.6046896-775-145596977221729/AnsiballZ_stat.py'
Dec 08 20:04:26 compute-0 sudo[150494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:27 compute-0 python3.9[150496]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:27 compute-0 sudo[150494]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:27 compute-0 sudo[150617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjqdgullwesqisxxsembdxgboxmbslyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224266.6046896-775-145596977221729/AnsiballZ_copy.py'
Dec 08 20:04:27 compute-0 sudo[150617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:27 compute-0 python3.9[150619]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224266.6046896-775-145596977221729/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:27 compute-0 sudo[150617]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:28 compute-0 sudo[150769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whkkoydmpsggeuibzcrznmvixfolumar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224267.8850558-775-263729948118229/AnsiballZ_stat.py'
Dec 08 20:04:28 compute-0 sudo[150769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:28 compute-0 python3.9[150771]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:28 compute-0 sudo[150769]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:28 compute-0 sudo[150892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmddgwygrgrvqmefyzqqfvjysdgotjsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224267.8850558-775-263729948118229/AnsiballZ_copy.py'
Dec 08 20:04:28 compute-0 sudo[150892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:28 compute-0 python3.9[150894]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224267.8850558-775-263729948118229/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:28 compute-0 sudo[150892]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:29 compute-0 sudo[151044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joooyhdpcbuhntrveyafwshyuqakygdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224269.1226501-775-269497514616726/AnsiballZ_stat.py'
Dec 08 20:04:29 compute-0 sudo[151044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:29 compute-0 python3.9[151046]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:29 compute-0 sudo[151044]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:29 compute-0 sudo[151167]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhmatowqsafzkiegfipfqvuzswoxbejr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224269.1226501-775-269497514616726/AnsiballZ_copy.py'
Dec 08 20:04:29 compute-0 sudo[151167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:30 compute-0 python3.9[151169]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224269.1226501-775-269497514616726/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:30 compute-0 sudo[151167]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:30 compute-0 sudo[151319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-olwnsnpduzrldfgdwjwnkcvhnuksixqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224270.3503888-775-7972963031378/AnsiballZ_stat.py'
Dec 08 20:04:30 compute-0 sudo[151319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:30 compute-0 python3.9[151321]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:30 compute-0 sudo[151319]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:31 compute-0 sudo[151442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wovvgmseytkipsuunuabfojcavcovium ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224270.3503888-775-7972963031378/AnsiballZ_copy.py'
Dec 08 20:04:31 compute-0 sudo[151442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:31 compute-0 python3.9[151444]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224270.3503888-775-7972963031378/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:31 compute-0 sudo[151442]: pam_unix(sudo:session): session closed for user root
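Each of those drop-in directories then receives an override.conf rendered from the libvirt-socket.unit.j2 template; the identical checksum shows every unit gets the same body, but the rendered content itself is not logged (content=NOT_LOGGING_PARAMETER). The snippet below is therefore only an assumed illustration of the kind of [Socket] override such a template commonly produces, not the actual file:

# Hypothetical override body, shown for one unit; the real template output is not
# captured in this log. Ownership and mode are taken from the logged parameters.
cat > /etc/systemd/system/virtqemud.socket.d/override.conf <<'EOF'
[Socket]
SocketMode=0660
SocketGroup=libvirt
EOF
chown root:root /etc/systemd/system/virtqemud.socket.d/override.conf
chmod 0644 /etc/systemd/system/virtqemud.socket.d/override.conf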
Dec 08 20:04:31 compute-0 podman[151445]: 2025-12-08 20:04:31.517173516 +0000 UTC m=+0.088927564 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Dec 08 20:04:32 compute-0 python3.9[151621]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
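The shell task above is a sanity check: it lists SELinux contexts under /run/libvirt and greps for any container_*_t type, which would indicate labels presumably left behind by a previously containerized libvirt. Reproduced as a standalone check:

# Prints matches (and exits 0) only if something under /run/libvirt still carries a
# container_* SELinux type; with pipefail, a missing directory also fails the pipeline.
set -o pipefail
ls -lRZ /run/libvirt | grep -E ':container_\S+_t'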
Dec 08 20:04:32 compute-0 sudo[151774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayavxrqgrpwfrrlrydyqomgqyfyyabql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224272.342724-981-200787730857242/AnsiballZ_seboolean.py'
Dec 08 20:04:32 compute-0 sudo[151774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:33 compute-0 python3.9[151776]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 08 20:04:34 compute-0 sudo[151774]: pam_unix(sudo:session): session closed for user root
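The seboolean task persistently turns on the os_enable_vtpm SELinux boolean, typically enabled so that confined guests can use an emulated TPM (swtpm). Shell equivalent:

# Same effect as the ansible.posix.seboolean task; -P writes the change to the
# policy store so it persists across reboots.
setsebool -P os_enable_vtpm on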
Dec 08 20:04:34 compute-0 sudo[151930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amwtqdxnrsbwxqhmeagjrwxhjsxqrpze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224274.434187-989-71728714201381/AnsiballZ_copy.py'
Dec 08 20:04:34 compute-0 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 08 20:04:34 compute-0 sudo[151930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:34 compute-0 python3.9[151932]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:34 compute-0 sudo[151930]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:35 compute-0 sudo[152082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-unwndtbsvgjndtyzhahiuffcmcslfdmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224275.1617155-989-167493210751915/AnsiballZ_copy.py'
Dec 08 20:04:35 compute-0 sudo[152082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:35 compute-0 python3.9[152084]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:35 compute-0 sudo[152082]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:36 compute-0 sudo[152234]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzolxylbvkdhugjcchqnztaehsxodyof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224275.8098862-989-115828526924804/AnsiballZ_copy.py'
Dec 08 20:04:36 compute-0 sudo[152234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:36 compute-0 python3.9[152236]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:36 compute-0 sudo[152234]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:36 compute-0 sudo[152386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwxxfxfruofdqconcxxbyzxoyywkuoqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224276.5150852-989-29604184496131/AnsiballZ_copy.py'
Dec 08 20:04:36 compute-0 sudo[152386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:37 compute-0 python3.9[152388]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:37 compute-0 sudo[152386]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:37 compute-0 sudo[152538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsiyqtzknbljjsfhlaqiddxkusywfyuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224277.1747632-989-9281564448685/AnsiballZ_copy.py'
Dec 08 20:04:37 compute-0 sudo[152538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:37 compute-0 python3.9[152540]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:37 compute-0 sudo[152538]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:38 compute-0 sudo[152690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edjfivmepbsrfqsdpuhwispyvhigwadp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224277.8194876-1025-110539466979686/AnsiballZ_copy.py'
Dec 08 20:04:38 compute-0 sudo[152690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:38 compute-0 python3.9[152692]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:38 compute-0 sudo[152690]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:38 compute-0 sudo[152842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-errshtmtzxpxgypvgtrcoainlwaagkzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224278.5082026-1025-164412761300057/AnsiballZ_copy.py'
Dec 08 20:04:38 compute-0 sudo[152842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:38 compute-0 python3.9[152844]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:38 compute-0 sudo[152842]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:39 compute-0 sudo[152994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgaziamypdnqzlhrrhyeyyruiaacmbca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224279.1224954-1025-195307830322733/AnsiballZ_copy.py'
Dec 08 20:04:39 compute-0 sudo[152994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:39 compute-0 python3.9[152996]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:39 compute-0 sudo[152994]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:40 compute-0 sudo[153146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qqbdnmhjrjeswzixxqgzkqnkyzfzkaeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224279.7566957-1025-84500297671372/AnsiballZ_copy.py'
Dec 08 20:04:40 compute-0 sudo[153146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:40 compute-0 python3.9[153148]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:40 compute-0 sudo[153146]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:40 compute-0 sudo[153298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojfkyqoieklvrhhbnkmuyqxygpsuoikb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224280.4077117-1025-175480892508545/AnsiballZ_copy.py'
Dec 08 20:04:40 compute-0 sudo[153298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:40 compute-0 python3.9[153300]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:40 compute-0 sudo[153298]: pam_unix(sudo:session): session closed for user root
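The run of copy tasks above fans a single issued certificate/key pair plus its CA out to the locations libvirt and QEMU read for TLS, reusing the same tls.crt/tls.key for both the server and client roles. A shell sketch of the same layout, with owners, groups and modes taken from the logged parameters:

src=/var/lib/openstack/certs/libvirt/default

# libvirt TLS material (root-owned)
install -m 0644 -o root -g root "$src/tls.crt" /etc/pki/libvirt/servercert.pem
install -m 0600 -o root -g root "$src/tls.key" /etc/pki/libvirt/private/serverkey.pem
install -m 0644 -o root -g root "$src/tls.crt" /etc/pki/libvirt/clientcert.pem
install -m 0644 -o root -g root "$src/tls.key" /etc/pki/libvirt/private/clientkey.pem
install -m 0644 -o root -g root "$src/ca.crt"  /etc/pki/CA/cacert.pem

# QEMU native-TLS material (group qemu, mode 0640)
install -m 0640 -o root -g qemu "$src/tls.crt" /etc/pki/qemu/server-cert.pem
install -m 0640 -o root -g qemu "$src/tls.key" /etc/pki/qemu/server-key.pem
install -m 0640 -o root -g qemu "$src/tls.crt" /etc/pki/qemu/client-cert.pem
install -m 0640 -o root -g qemu "$src/tls.key" /etc/pki/qemu/client-key.pem
install -m 0640 -o root -g qemu "$src/ca.crt"  /etc/pki/qemu/ca-cert.pem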
Dec 08 20:04:41 compute-0 sudo[153450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmkmuhszygrpnaeokrfcwddnfplzrelm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224281.1018224-1061-166846393067188/AnsiballZ_systemd.py'
Dec 08 20:04:41 compute-0 sudo[153450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:41 compute-0 python3.9[153452]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:04:41 compute-0 systemd[1]: Reloading.
Dec 08 20:04:41 compute-0 systemd-sysv-generator[153484]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:04:41 compute-0 systemd-rc-local-generator[153481]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:04:41 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Dec 08 20:04:41 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Dec 08 20:04:41 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 08 20:04:42 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 08 20:04:42 compute-0 systemd[1]: Starting libvirt logging daemon...
Dec 08 20:04:42 compute-0 systemd[1]: Started libvirt logging daemon.
Dec 08 20:04:42 compute-0 sudo[153450]: pam_unix(sudo:session): session closed for user root
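With the certificates and socket overrides in place, each modular libvirt daemon is restarted after a daemon-reload so systemd picks up the new drop-ins; virtlogd goes first, followed below by virtnodedevd, virtproxyd and virtqemud. Per-service shell equivalent:

# What each ansible.builtin.systemd task (daemon_reload=True, state=restarted) boils down to.
systemctl daemon-reload
systemctl restart virtlogd.service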
Dec 08 20:04:42 compute-0 sudo[153644]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbptvtsbvyptncgvavgukiydbnhcarwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224282.2612393-1061-17525235067644/AnsiballZ_systemd.py'
Dec 08 20:04:42 compute-0 sudo[153644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:42 compute-0 python3.9[153646]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:04:42 compute-0 systemd[1]: Reloading.
Dec 08 20:04:43 compute-0 systemd-sysv-generator[153676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:04:43 compute-0 systemd-rc-local-generator[153673]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:04:43 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 08 20:04:43 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 08 20:04:43 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 08 20:04:43 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 08 20:04:43 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 08 20:04:43 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 08 20:04:43 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 08 20:04:43 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 08 20:04:43 compute-0 sudo[153644]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:43 compute-0 sudo[153859]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kilcjsazykkvvrbctbzkrtipkpfnalyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224283.4277964-1061-71285372650594/AnsiballZ_systemd.py'
Dec 08 20:04:43 compute-0 sudo[153859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:44 compute-0 python3.9[153861]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:04:44 compute-0 systemd[1]: Reloading.
Dec 08 20:04:44 compute-0 systemd-rc-local-generator[153889]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:04:44 compute-0 systemd-sysv-generator[153893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:04:44 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 08 20:04:44 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 08 20:04:44 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 08 20:04:44 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 08 20:04:44 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 08 20:04:44 compute-0 systemd[1]: Starting libvirt proxy daemon...
Dec 08 20:04:44 compute-0 systemd[1]: Started libvirt proxy daemon.
Dec 08 20:04:44 compute-0 sudo[153859]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:44 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 08 20:04:44 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 08 20:04:44 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 08 20:04:44 compute-0 sudo[154073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhucgoxmfoklvenvfkrefytbkwwlijkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224284.5155456-1061-109677881544926/AnsiballZ_systemd.py'
Dec 08 20:04:44 compute-0 sudo[154073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:45 compute-0 python3.9[154079]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:04:45 compute-0 systemd[1]: Reloading.
Dec 08 20:04:45 compute-0 systemd-rc-local-generator[154106]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:04:45 compute-0 systemd-sysv-generator[154110]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:04:45 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Dec 08 20:04:45 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 08 20:04:45 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 08 20:04:45 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 08 20:04:45 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 08 20:04:45 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 08 20:04:45 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 08 20:04:45 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 08 20:04:45 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 08 20:04:45 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 08 20:04:45 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 08 20:04:45 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 08 20:04:45 compute-0 sudo[154073]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:45 compute-0 setroubleshoot[153897]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 6ec40573-56a7-4f4a-8ece-80ed45cbedb3
Dec 08 20:04:45 compute-0 setroubleshoot[153897]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify whether the domain needs this access, or whether a file on your system has the wrong permissions,
                                                  then turn on full auditing to get path information about the offending file and reproduce the denial.
                                                  
                                                  Turn on full auditing:
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate the AVC, then execute:
                                                  # ausearch -m avc -ts recent
                                                  If you see a PATH record, check the ownership and permissions on the file and fix them;
                                                  otherwise, report this as a bug in Bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default,
                                                  then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  To allow this access for now, execute:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
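Taken together, the two plugin suggestions describe one workflow: enable full auditing, reproduce the denial, inspect the AVC, and only then decide between fixing file permissions and shipping a local policy module. A consolidated sketch, assuming the denial can be reproduced by restarting virtlogd (the my-virtlogd module name is taken from the advisory itself):

    # auditctl -w /etc/shadow -p w
    # systemctl restart virtlogd.service
    # ausearch -m avc -ts recent
    # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
    # semodule -X 300 -i my-virtlogd.pp

If the policy is later fixed upstream, the local module can be removed again with semodule -r my-virtlogd.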
Dec 08 20:04:45 compute-0 sudo[154296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktunygsxaqgzskqnjfswgizosnbypfie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224285.6431994-1061-146795033059791/AnsiballZ_systemd.py'
Dec 08 20:04:45 compute-0 sudo[154296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:46 compute-0 python3.9[154298]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:04:46 compute-0 systemd[1]: Reloading.
Dec 08 20:04:46 compute-0 systemd-rc-local-generator[154325]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:04:46 compute-0 systemd-sysv-generator[154328]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:04:46 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Dec 08 20:04:46 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Dec 08 20:04:46 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 08 20:04:46 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 08 20:04:46 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 08 20:04:46 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 08 20:04:46 compute-0 systemd[1]: Starting libvirt secret daemon...
Dec 08 20:04:46 compute-0 systemd[1]: Started libvirt secret daemon.
Dec 08 20:04:46 compute-0 sudo[154296]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:47 compute-0 sudo[154507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyzkhwpmzegatztaqsnpkouihzchdxio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224286.9171796-1098-213437794845612/AnsiballZ_file.py'
Dec 08 20:04:47 compute-0 sudo[154507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:47 compute-0 python3.9[154509]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:47 compute-0 sudo[154507]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:48 compute-0 sudo[154659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivejaczlodluyqwvnsnjclrqzhgppluv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224287.7276845-1106-277475395537568/AnsiballZ_find.py'
Dec 08 20:04:48 compute-0 sudo[154659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:48 compute-0 python3.9[154661]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 08 20:04:48 compute-0 sudo[154659]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:48 compute-0 sudo[154813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfqdcmvxhnlnoxxvdvptasrvfuyoshvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224288.7073336-1120-6978789721321/AnsiballZ_stat.py'
Dec 08 20:04:48 compute-0 sudo[154813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:49 compute-0 python3.9[154815]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:49 compute-0 sudo[154813]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:49 compute-0 sudo[154936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjdwcszjiopslpefzhlxhgklovmefwvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224288.7073336-1120-6978789721321/AnsiballZ_copy.py'
Dec 08 20:04:49 compute-0 sudo[154936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:49 compute-0 python3.9[154938]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224288.7073336-1120-6978789721321/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:49 compute-0 sudo[154936]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:50 compute-0 sudo[155088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsbbwjinbvygwbhkjnayojvtcwritfno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224290.1008966-1136-18738342062837/AnsiballZ_file.py'
Dec 08 20:04:50 compute-0 sudo[155088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:50 compute-0 python3.9[155090]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:50 compute-0 sudo[155088]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:51 compute-0 sudo[155240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwreimzrnmtbnfsvybflymokqpvdwaxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224290.8290946-1144-219428102942115/AnsiballZ_stat.py'
Dec 08 20:04:51 compute-0 sudo[155240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:51 compute-0 python3.9[155242]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:51 compute-0 sudo[155240]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:51 compute-0 sudo[155318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twkojuztnubnlmegrlluijxrgkfyrmcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224290.8290946-1144-219428102942115/AnsiballZ_file.py'
Dec 08 20:04:51 compute-0 sudo[155318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:51 compute-0 python3.9[155320]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:51 compute-0 sudo[155318]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:52 compute-0 sudo[155470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjdpvtdwqoennpvdbrsoaljugzmvdpiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224291.9580033-1156-255486001155972/AnsiballZ_stat.py'
Dec 08 20:04:52 compute-0 sudo[155470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:52 compute-0 sshd-session[154709]: Received disconnect from 45.78.228.32 port 36790:11: Bye Bye [preauth]
Dec 08 20:04:52 compute-0 sshd-session[154709]: Disconnected from authenticating user root 45.78.228.32 port 36790 [preauth]
Dec 08 20:04:52 compute-0 python3.9[155472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:52 compute-0 sudo[155470]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:52 compute-0 sudo[155548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhzshqtyqvmpwzqwkxocjsbyrqmkmulg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224291.9580033-1156-255486001155972/AnsiballZ_file.py'
Dec 08 20:04:52 compute-0 sudo[155548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:52 compute-0 python3.9[155550]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.om6rj05f recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:52 compute-0 sudo[155548]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:53 compute-0 sudo[155700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpykdvwtkgkcorwamippgnthneiobvmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224293.15787-1168-196421210167319/AnsiballZ_stat.py'
Dec 08 20:04:53 compute-0 sudo[155700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:53 compute-0 python3.9[155702]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:53 compute-0 sudo[155700]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:53 compute-0 sudo[155778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gekjsolleaqenpqkkiglsqhqpymvnsgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224293.15787-1168-196421210167319/AnsiballZ_file.py'
Dec 08 20:04:53 compute-0 sudo[155778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:54 compute-0 python3.9[155780]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:54 compute-0 sudo[155778]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:54 compute-0 sudo[155930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-momntdisswklrdrsojykrhoenjzhkkzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224294.3252156-1181-147119647100900/AnsiballZ_command.py'
Dec 08 20:04:54 compute-0 sudo[155930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:54 compute-0 python3.9[155932]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:04:54 compute-0 sudo[155930]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:04:54.976 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:04:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:04:54.977 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:04:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:04:54.978 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:04:55 compute-0 sudo[156083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvyxhbmazypdpuvfuxdzpiumpaobvspm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224295.1065361-1189-47232241600408/AnsiballZ_edpm_nftables_from_files.py'
Dec 08 20:04:55 compute-0 sudo[156083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:55 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 08 20:04:55 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.051s CPU time.
Dec 08 20:04:55 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 08 20:04:55 compute-0 python3[156085]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 08 20:04:56 compute-0 sudo[156083]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:56 compute-0 podman[156185]: 2025-12-08 20:04:56.500682372 +0000 UTC m=+0.060096324 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec 08 20:04:56 compute-0 sudo[156254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shvkcbzmttktbizkxunfndkwvygwykyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224296.2015564-1197-179075356155405/AnsiballZ_stat.py'
Dec 08 20:04:56 compute-0 sudo[156254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:56 compute-0 python3.9[156256]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:56 compute-0 sudo[156254]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:57 compute-0 sudo[156332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwearswfytncodxxbyghqgqelqycwcvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224296.2015564-1197-179075356155405/AnsiballZ_file.py'
Dec 08 20:04:57 compute-0 sudo[156332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:57 compute-0 python3.9[156334]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:57 compute-0 sudo[156332]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:57 compute-0 sudo[156484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivfpkktpozaxbeolzepvfginlqdmhnvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224297.4204066-1209-37487591911788/AnsiballZ_stat.py'
Dec 08 20:04:57 compute-0 sudo[156484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:58 compute-0 python3.9[156486]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:58 compute-0 sudo[156484]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:58 compute-0 sudo[156562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjfynqkwxgrzclnsdupwbhwcbtcdtkca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224297.4204066-1209-37487591911788/AnsiballZ_file.py'
Dec 08 20:04:58 compute-0 sudo[156562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:58 compute-0 python3.9[156564]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:58 compute-0 sudo[156562]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:58 compute-0 sudo[156714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boakgcnhyypvkpzorlerucojdaoyxocu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224298.6525555-1221-70908302830404/AnsiballZ_stat.py'
Dec 08 20:04:58 compute-0 sudo[156714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:59 compute-0 python3.9[156716]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:04:59 compute-0 sudo[156714]: pam_unix(sudo:session): session closed for user root
Dec 08 20:04:59 compute-0 sudo[156792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adtkxwmpajtzscqjtdhxkvnyhnuarnmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224298.6525555-1221-70908302830404/AnsiballZ_file.py'
Dec 08 20:04:59 compute-0 sudo[156792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:04:59 compute-0 python3.9[156794]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:04:59 compute-0 sudo[156792]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:00 compute-0 sudo[156944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhhvjqxayxydwpadsmsxakjtaxwhuprw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224300.0302284-1233-176911085919383/AnsiballZ_stat.py'
Dec 08 20:05:00 compute-0 sudo[156944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:00 compute-0 python3.9[156946]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:05:00 compute-0 sudo[156944]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:00 compute-0 sudo[157022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itiacsmiyqzlqakqjjfihhrmvopseovy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224300.0302284-1233-176911085919383/AnsiballZ_file.py'
Dec 08 20:05:00 compute-0 sudo[157022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:01 compute-0 python3.9[157024]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:01 compute-0 sudo[157022]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:01 compute-0 sudo[157185]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cddnuqtuuessfimajvkdpsakfiezxxpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224301.3212829-1245-57885286787566/AnsiballZ_stat.py'
Dec 08 20:05:01 compute-0 sudo[157185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:01 compute-0 podman[157148]: 2025-12-08 20:05:01.775830603 +0000 UTC m=+0.088034296 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:05:01 compute-0 python3.9[157191]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:05:01 compute-0 sudo[157185]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:02 compute-0 sudo[157323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swxjmwdeemewsbpuzolgrbhugwnlqtld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224301.3212829-1245-57885286787566/AnsiballZ_copy.py'
Dec 08 20:05:02 compute-0 sudo[157323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:02 compute-0 python3.9[157325]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224301.3212829-1245-57885286787566/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:02 compute-0 sudo[157323]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:02 compute-0 sudo[157475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrouyszituponkgencvjtuyfpdjsqish ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224302.6573374-1260-211844911002066/AnsiballZ_file.py'
Dec 08 20:05:02 compute-0 sudo[157475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:03 compute-0 python3.9[157477]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:03 compute-0 sudo[157475]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:03 compute-0 sudo[157627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpyoucfmzhqjfpbuhmrstmjplipapmpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224303.3489668-1268-249429246598133/AnsiballZ_command.py'
Dec 08 20:05:03 compute-0 sudo[157627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:03 compute-0 python3.9[157629]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:05:03 compute-0 sudo[157627]: pam_unix(sudo:session): session closed for user root
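The command above stitches the generated fragments together in load order (chains, flushes, rules, update-jumps, jumps) and passes them to nft -c, which parses and validates the ruleset without installing it. The same dry run can be repeated by hand:

    # cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft \
          /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft \
          /etc/nftables/edpm-jumps.nft | nft -c -f -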
Dec 08 20:05:04 compute-0 sudo[157782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-noarqprosbdtfoskozsgjccsqtmsirud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224304.202701-1276-153504389025093/AnsiballZ_blockinfile.py'
Dec 08 20:05:04 compute-0 sudo[157782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:04 compute-0 python3.9[157784]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:04 compute-0 sudo[157782]: pam_unix(sudo:session): session closed for user root
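The blockinfile task keeps a marked block in /etc/sysconfig/nftables.conf so the EDPM rule files are loaded by the nftables service at boot, and it validates the edited file with nft -c -f %s before writing it. Given the marker and block parameters logged above, the managed section should end up looking roughly like this:

    # BEGIN ANSIBLE MANAGED BLOCK
    include "/etc/nftables/iptables.nft"
    include "/etc/nftables/edpm-chains.nft"
    include "/etc/nftables/edpm-rules.nft"
    include "/etc/nftables/edpm-jumps.nft"
    # END ANSIBLE MANAGED BLOCK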
Dec 08 20:05:05 compute-0 sudo[157934]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tphsvwhhwowyiwwxconqkbxwbqobdofg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224305.2106872-1285-215441149085569/AnsiballZ_command.py'
Dec 08 20:05:05 compute-0 sudo[157934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:05 compute-0 python3.9[157936]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:05:05 compute-0 sudo[157934]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:06 compute-0 sudo[158087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvluwkkpkpqosrkqdmbalyjppqpfedmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224305.87804-1293-192915629770750/AnsiballZ_stat.py'
Dec 08 20:05:06 compute-0 sudo[158087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:06 compute-0 python3.9[158089]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:05:06 compute-0 sudo[158087]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:06 compute-0 sudo[158241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjsidkyyvdqmxniyifhqesmodtciqnah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224306.5180423-1301-25425470459979/AnsiballZ_command.py'
Dec 08 20:05:06 compute-0 sudo[158241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:06 compute-0 python3.9[158243]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:05:07 compute-0 sudo[158241]: pam_unix(sudo:session): session closed for user root
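After the chains file has been loaded on its own, this step applies the flush, rule, and update-jump fragments to the live ruleset; nft -f - loads the concatenated input as a single transaction, so either all of it takes effect or none of it. The installed rules can then be inspected with either of:

    # nft list ruleset
    # nft -j list ruleset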
Dec 08 20:05:07 compute-0 sudo[158396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmisnrayopxhrbgcqwnlcsbxhdqmnegs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224307.1815903-1309-253061248438436/AnsiballZ_file.py'
Dec 08 20:05:07 compute-0 sudo[158396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:07 compute-0 python3.9[158398]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:07 compute-0 sudo[158396]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:08 compute-0 sudo[158548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vrbxtaxsweurhbdglqdrsadksyzqguck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224307.950108-1317-178104237467842/AnsiballZ_stat.py'
Dec 08 20:05:08 compute-0 sudo[158548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:08 compute-0 python3.9[158550]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:05:08 compute-0 sudo[158548]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:08 compute-0 sudo[158671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ripmvlddcqzntgvjawliofsgecucwejq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224307.950108-1317-178104237467842/AnsiballZ_copy.py'
Dec 08 20:05:08 compute-0 sudo[158671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:09 compute-0 python3.9[158673]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224307.950108-1317-178104237467842/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:09 compute-0 sudo[158671]: pam_unix(sudo:session): session closed for user root
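The file just copied is a plain systemd target used to group the modular libvirt daemons started earlier. Its actual contents are not shown in this log; a hypothetical minimal target of that shape, using only unit names that appear above, might look like:

    # hypothetical contents, not taken from the log
    [Unit]
    Description=EDPM libvirt target
    Wants=virtqemud.service virtlogd.service virtsecretd.service virtnodedevd.service virtproxyd.service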
Dec 08 20:05:09 compute-0 sudo[158823]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swtooeqcbvwcsydzyixprafcdaykvgzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224309.32814-1332-15112708614990/AnsiballZ_stat.py'
Dec 08 20:05:09 compute-0 sudo[158823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:09 compute-0 python3.9[158825]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:05:09 compute-0 sudo[158823]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:10 compute-0 sudo[158946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwlnjpssixarjubjitfzkimbopbomgka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224309.32814-1332-15112708614990/AnsiballZ_copy.py'
Dec 08 20:05:10 compute-0 sudo[158946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:10 compute-0 python3.9[158948]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224309.32814-1332-15112708614990/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:10 compute-0 sudo[158946]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:10 compute-0 sudo[159098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvhkfxwcrkqstlbfkpldharziatmhyvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224310.5014184-1347-166249810759394/AnsiballZ_stat.py'
Dec 08 20:05:10 compute-0 sudo[159098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:10 compute-0 python3.9[159100]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:05:11 compute-0 sudo[159098]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:11 compute-0 sudo[159221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esojwafrwwuvrpyfjlrubaukuzlsvicn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224310.5014184-1347-166249810759394/AnsiballZ_copy.py'
Dec 08 20:05:11 compute-0 sudo[159221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:11 compute-0 python3.9[159223]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224310.5014184-1347-166249810759394/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:11 compute-0 sudo[159221]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:12 compute-0 sudo[159374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqegvddieouxdvgkfjuubwhbhuotjnsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224311.7131653-1362-256176222606417/AnsiballZ_systemd.py'
Dec 08 20:05:12 compute-0 sudo[159374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:12 compute-0 python3.9[159376]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:05:12 compute-0 systemd[1]: Reloading.
Dec 08 20:05:12 compute-0 systemd-rc-local-generator[159405]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:05:12 compute-0 systemd-sysv-generator[159408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:05:12 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Dec 08 20:05:12 compute-0 sudo[159374]: pam_unix(sudo:session): session closed for user root
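With the daemon reload done, the target is enabled and reached. Whether it is active, and which units it pulls in, can be checked with:

    # systemctl status edpm_libvirt.target
    # systemctl list-dependencies edpm_libvirt.target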
Dec 08 20:05:13 compute-0 sudo[159566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izjgbpyblzynpuuqrzqipkmwxsvfvgkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224312.8367364-1370-57669851531928/AnsiballZ_systemd.py'
Dec 08 20:05:13 compute-0 sudo[159566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:13 compute-0 python3.9[159568]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 08 20:05:13 compute-0 systemd[1]: Reloading.
Dec 08 20:05:13 compute-0 systemd-rc-local-generator[159597]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:05:13 compute-0 systemd-sysv-generator[159601]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:05:13 compute-0 systemd[1]: Reloading.
Dec 08 20:05:13 compute-0 systemd-rc-local-generator[159634]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:05:13 compute-0 systemd-sysv-generator[159639]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:05:14 compute-0 sudo[159566]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:14 compute-0 sshd-session[159248]: Invalid user maarch from 101.47.160.247 port 50950
Dec 08 20:05:14 compute-0 sshd-session[159248]: Received disconnect from 101.47.160.247 port 50950:11: Bye Bye [preauth]
Dec 08 20:05:14 compute-0 sshd-session[159248]: Disconnected from invalid user maarch 101.47.160.247 port 50950 [preauth]
Dec 08 20:05:14 compute-0 sshd-session[105172]: Connection closed by 192.168.122.30 port 60506
Dec 08 20:05:14 compute-0 sshd-session[105169]: pam_unix(sshd:session): session closed for user zuul
Dec 08 20:05:14 compute-0 systemd[1]: session-22.scope: Deactivated successfully.
Dec 08 20:05:14 compute-0 systemd[1]: session-22.scope: Consumed 3min 25.838s CPU time.
Dec 08 20:05:14 compute-0 systemd-logind[793]: Session 22 logged out. Waiting for processes to exit.
Dec 08 20:05:14 compute-0 systemd-logind[793]: Removed session 22.
Dec 08 20:05:20 compute-0 sshd-session[159667]: Accepted publickey for zuul from 192.168.122.30 port 37594 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 20:05:20 compute-0 systemd-logind[793]: New session 23 of user zuul.
Dec 08 20:05:20 compute-0 systemd[1]: Started Session 23 of User zuul.
Dec 08 20:05:20 compute-0 sshd-session[159667]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 20:05:21 compute-0 python3.9[159820]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 20:05:22 compute-0 python3.9[159974]: ansible-ansible.builtin.service_facts Invoked
Dec 08 20:05:22 compute-0 network[159991]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 08 20:05:22 compute-0 network[159992]: 'network-scripts' will be removed from distribution in near future.
Dec 08 20:05:22 compute-0 network[159993]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 08 20:05:23 compute-0 sshd-session[160002]: Received disconnect from 172.190.42.55 port 55982:11: Bye Bye [preauth]
Dec 08 20:05:23 compute-0 sshd-session[160002]: Disconnected from authenticating user root 172.190.42.55 port 55982 [preauth]
Dec 08 20:05:26 compute-0 podman[160106]: 2025-12-08 20:05:26.638181553 +0000 UTC m=+0.065157492 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 08 20:05:27 compute-0 sudo[160283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okrzlillrxjtbvqoikrvqekgzkobcisg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224327.079537-47-174932158505018/AnsiballZ_setup.py'
Dec 08 20:05:27 compute-0 sudo[160283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:27 compute-0 python3.9[160285]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 08 20:05:28 compute-0 sudo[160283]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:28 compute-0 sudo[160367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssmiqrpjpxaxlqnvzhosretldlypmzyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224327.079537-47-174932158505018/AnsiballZ_dnf.py'
Dec 08 20:05:28 compute-0 sudo[160367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:28 compute-0 python3.9[160369]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 20:05:32 compute-0 podman[160371]: 2025-12-08 20:05:32.603693192 +0000 UTC m=+0.164820581 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:05:33 compute-0 sudo[160367]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:34 compute-0 sudo[160546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kowqxgkufmfgpzckruwlhbohvatkzzvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224334.002434-59-120265804299578/AnsiballZ_stat.py'
Dec 08 20:05:34 compute-0 sudo[160546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:34 compute-0 python3.9[160548]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:05:34 compute-0 sudo[160546]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:35 compute-0 sudo[160698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzfphrtiuruycihgmxdevirfcjedpcgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224334.8939562-69-138155076295818/AnsiballZ_command.py'
Dec 08 20:05:35 compute-0 sudo[160698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:35 compute-0 python3.9[160700]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:05:35 compute-0 sudo[160698]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:36 compute-0 sudo[160851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uexsshxekhiljcrgrjqgvnvdlhaqrsqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224335.8778064-79-43828043777026/AnsiballZ_stat.py'
Dec 08 20:05:36 compute-0 sudo[160851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:36 compute-0 python3.9[160853]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:05:36 compute-0 sudo[160851]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:36 compute-0 sudo[161003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-typhlqbuwocvvlyuhkmdupupqsvyumnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224336.5431895-87-178267746498354/AnsiballZ_command.py'
Dec 08 20:05:36 compute-0 sudo[161003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:37 compute-0 python3.9[161005]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:05:37 compute-0 sudo[161003]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:37 compute-0 sudo[161156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-caftiykqscunvhifghaiyhgneqimgfwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224337.229051-95-221000578956121/AnsiballZ_stat.py'
Dec 08 20:05:37 compute-0 sudo[161156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:37 compute-0 python3.9[161158]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:05:37 compute-0 sudo[161156]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:38 compute-0 sudo[161279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxxlmwljqptanzgniqdjoiwqwxtmnldm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224337.229051-95-221000578956121/AnsiballZ_copy.py'
Dec 08 20:05:38 compute-0 sudo[161279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:38 compute-0 python3.9[161281]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224337.229051-95-221000578956121/.source.iscsi _original_basename=.psf1fv4h follow=False checksum=a585c8cdf9106130fec6976bbb07a1f410f38c6a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:38 compute-0 sudo[161279]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:39 compute-0 sudo[161431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnmdwrkukpixtrwduvltqkvfcziakaii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224338.695033-110-2649055267344/AnsiballZ_file.py'
Dec 08 20:05:39 compute-0 sudo[161431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:39 compute-0 python3.9[161433]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:39 compute-0 sudo[161431]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:40 compute-0 sudo[161583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxakhkzubxmtyroajvjlnhhbhsphedff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224339.5704558-118-28139511191602/AnsiballZ_lineinfile.py'
Dec 08 20:05:40 compute-0 sudo[161583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:40 compute-0 python3.9[161585]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:40 compute-0 sudo[161583]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:40 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 08 20:05:40 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 08 20:05:41 compute-0 sudo[161736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbhwzjdetojdqchneoyhtpniaptsxjoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224340.4846737-127-240047087688959/AnsiballZ_systemd_service.py'
Dec 08 20:05:41 compute-0 sudo[161736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:41 compute-0 python3.9[161738]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:05:41 compute-0 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 08 20:05:41 compute-0 sudo[161736]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:41 compute-0 sudo[161892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evivkmfbhsivlssrqkmrgomeugspdhxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224341.608939-135-199492701733032/AnsiballZ_systemd_service.py'
Dec 08 20:05:41 compute-0 sudo[161892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:42 compute-0 python3.9[161894]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:05:42 compute-0 systemd[1]: Reloading.
Dec 08 20:05:42 compute-0 systemd-sysv-generator[161926]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:05:42 compute-0 systemd-rc-local-generator[161921]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:05:42 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 08 20:05:42 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 08 20:05:42 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Dec 08 20:05:42 compute-0 systemd[1]: Started Open-iSCSI.
Dec 08 20:05:42 compute-0 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 08 20:05:42 compute-0 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 08 20:05:42 compute-0 sudo[161892]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:43 compute-0 sudo[162093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqtveaurconevbvdhcblyutaorioaria ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224342.9552934-146-50169622998333/AnsiballZ_service_facts.py'
Dec 08 20:05:43 compute-0 sudo[162093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:43 compute-0 python3.9[162095]: ansible-ansible.builtin.service_facts Invoked
Dec 08 20:05:43 compute-0 network[162112]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 08 20:05:43 compute-0 network[162113]: 'network-scripts' will be removed from distribution in near future.
Dec 08 20:05:43 compute-0 network[162114]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 08 20:05:46 compute-0 sudo[162093]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:47 compute-0 sudo[162383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keontzsaswtirkhjgjillqonzgyofibt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224347.2412324-156-262774995483756/AnsiballZ_file.py'
Dec 08 20:05:47 compute-0 sudo[162383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:47 compute-0 python3.9[162385]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 08 20:05:47 compute-0 sudo[162383]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:48 compute-0 sudo[162535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvwtjylrarmmomcaaujwmjfudempklhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224347.9229715-164-125845260803926/AnsiballZ_modprobe.py'
Dec 08 20:05:48 compute-0 sudo[162535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:48 compute-0 python3.9[162537]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 08 20:05:48 compute-0 sudo[162535]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:49 compute-0 sudo[162691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cafrgcqrwouukqoqdimafoizuxinbjrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224349.1009684-172-59502646879469/AnsiballZ_stat.py'
Dec 08 20:05:49 compute-0 sudo[162691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:49 compute-0 python3.9[162693]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:05:49 compute-0 sudo[162691]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:49 compute-0 sudo[162814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hewgmvxgfcgeihjewlmdcsiecyxenssd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224349.1009684-172-59502646879469/AnsiballZ_copy.py'
Dec 08 20:05:49 compute-0 sudo[162814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:50 compute-0 python3.9[162816]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224349.1009684-172-59502646879469/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:50 compute-0 sudo[162814]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:50 compute-0 sudo[162966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzazooroxeymcardvelgcrlbewpzievr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224350.378075-188-3077855355996/AnsiballZ_lineinfile.py'
Dec 08 20:05:50 compute-0 sudo[162966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:50 compute-0 python3.9[162968]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:50 compute-0 sudo[162966]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:51 compute-0 sudo[163118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyhrgaoxlnsxljehpqctvpsnagzccvpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224351.0499096-196-59967218131944/AnsiballZ_systemd.py'
Dec 08 20:05:51 compute-0 sudo[163118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:52 compute-0 python3.9[163120]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:05:52 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 08 20:05:52 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 08 20:05:52 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 08 20:05:52 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 08 20:05:52 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 08 20:05:52 compute-0 sudo[163118]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:52 compute-0 sudo[163274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvejptgondenzbcikfryiaqnqjxtvuqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224352.2872205-204-174366842082273/AnsiballZ_file.py'
Dec 08 20:05:52 compute-0 sudo[163274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:52 compute-0 python3.9[163276]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:05:52 compute-0 sudo[163274]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:53 compute-0 sudo[163426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxuyotgtlihclzaachvrqnmttomidczl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224352.9746096-213-48435210749762/AnsiballZ_stat.py'
Dec 08 20:05:53 compute-0 sudo[163426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:53 compute-0 python3.9[163428]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:05:53 compute-0 sudo[163426]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:53 compute-0 sudo[163578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laaqjnmjkudahsuhsqbmpqdtcxqphggj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224353.6660135-222-136526720200582/AnsiballZ_stat.py'
Dec 08 20:05:53 compute-0 sudo[163578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:54 compute-0 python3.9[163580]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:05:54 compute-0 sudo[163578]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:54 compute-0 sudo[163730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zruvxknnhzxaqhtqgemiimkqygbbbegt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224354.318878-230-81082832153249/AnsiballZ_stat.py'
Dec 08 20:05:54 compute-0 sudo[163730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:54 compute-0 python3.9[163732]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:05:54 compute-0 sudo[163730]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:05:54.977 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:05:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:05:54.978 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:05:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:05:54.979 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:05:55 compute-0 sudo[163853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ouzeuyfwvvttxewgaaqlsjxqymjkscvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224354.318878-230-81082832153249/AnsiballZ_copy.py'
Dec 08 20:05:55 compute-0 sudo[163853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:55 compute-0 python3.9[163855]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224354.318878-230-81082832153249/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:55 compute-0 sudo[163853]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:55 compute-0 sudo[164005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxwervmitskzwlxtmuxgjwyutdndtone ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224355.5206647-245-148046981169753/AnsiballZ_command.py'
Dec 08 20:05:55 compute-0 sudo[164005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:55 compute-0 python3.9[164007]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:05:56 compute-0 sudo[164005]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:56 compute-0 sudo[164158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykkeylvgujhtiloliuamntpbczbtmgtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224356.173228-253-217940967460852/AnsiballZ_lineinfile.py'
Dec 08 20:05:56 compute-0 sudo[164158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:56 compute-0 python3.9[164160]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:56 compute-0 sudo[164158]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:57 compute-0 sudo[164319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwxrylkuknlunbjhwvmjtqpdcxfoensq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224356.7766087-261-163254789249505/AnsiballZ_replace.py'
Dec 08 20:05:57 compute-0 sudo[164319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:57 compute-0 podman[164284]: 2025-12-08 20:05:57.264415715 +0000 UTC m=+0.051985653 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Dec 08 20:05:57 compute-0 python3.9[164331]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:57 compute-0 sudo[164319]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:58 compute-0 sudo[164482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkbggbcghcotyhhljsoymnmufeoplhla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224357.9435058-269-23692503565891/AnsiballZ_replace.py'
Dec 08 20:05:58 compute-0 sudo[164482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:58 compute-0 python3.9[164484]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:58 compute-0 sudo[164482]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:59 compute-0 sudo[164634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnddkjkmbdydbsorumbpvbdxcdxlcxiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224358.662806-278-279644589646219/AnsiballZ_lineinfile.py'
Dec 08 20:05:59 compute-0 sudo[164634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:59 compute-0 python3.9[164636]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:59 compute-0 sudo[164634]: pam_unix(sudo:session): session closed for user root
Dec 08 20:05:59 compute-0 sudo[164786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxdtjhcwvqypgbbyhvzfzazpzinizzwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224359.3806093-278-193917948231393/AnsiballZ_lineinfile.py'
Dec 08 20:05:59 compute-0 sudo[164786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:05:59 compute-0 python3.9[164788]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:05:59 compute-0 sudo[164786]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:00 compute-0 sudo[164938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvxzpjtzbdwxlgratxqerkunugzmhzep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224360.0300083-278-26692271582326/AnsiballZ_lineinfile.py'
Dec 08 20:06:00 compute-0 sudo[164938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:00 compute-0 python3.9[164940]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:00 compute-0 sudo[164938]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:00 compute-0 sudo[165090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pckgrjusmkeztzyklkzgbpehwgpcxlhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224360.6907248-278-99336241416289/AnsiballZ_lineinfile.py'
Dec 08 20:06:00 compute-0 sudo[165090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:01 compute-0 python3.9[165092]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:01 compute-0 sudo[165090]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:01 compute-0 sudo[165242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtjzrfxbpsporfwzokojxxazektuhszd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224361.3781404-307-75622270766472/AnsiballZ_stat.py'
Dec 08 20:06:01 compute-0 sudo[165242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:02 compute-0 python3.9[165244]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:06:02 compute-0 sudo[165242]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:02 compute-0 sudo[165396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnhvwwrtkvpuolopbeikinjpdhsqwgvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224362.32331-315-222195546017570/AnsiballZ_file.py'
Dec 08 20:06:02 compute-0 sudo[165396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:02 compute-0 podman[165398]: 2025-12-08 20:06:02.841913523 +0000 UTC m=+0.128989433 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 08 20:06:02 compute-0 python3.9[165399]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:02 compute-0 sudo[165396]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:03 compute-0 sudo[165574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzowtmtvowahobczicxrzxdptmwhxagg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224363.2046134-324-229072347353169/AnsiballZ_file.py'
Dec 08 20:06:03 compute-0 sudo[165574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:03 compute-0 python3.9[165576]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:06:03 compute-0 sudo[165574]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:04 compute-0 sudo[165726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whzyzmjtgjvyakizlikaqxyufmbszufi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224363.9047346-332-150000013104830/AnsiballZ_stat.py'
Dec 08 20:06:04 compute-0 sudo[165726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:04 compute-0 python3.9[165728]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:06:04 compute-0 sudo[165726]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:04 compute-0 sudo[165804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxtmitugsdybcyutroewjspnmefuaukh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224363.9047346-332-150000013104830/AnsiballZ_file.py'
Dec 08 20:06:04 compute-0 sudo[165804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:04 compute-0 python3.9[165806]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:06:04 compute-0 sudo[165804]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:05 compute-0 sudo[165956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwkbjppliozmxqflarzralpkudigdzyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224364.9874284-332-39949502021958/AnsiballZ_stat.py'
Dec 08 20:06:05 compute-0 sudo[165956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:05 compute-0 python3.9[165958]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:06:05 compute-0 sudo[165956]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:05 compute-0 sudo[166034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twhqeldyouptpngtxfymuduikrgaqzwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224364.9874284-332-39949502021958/AnsiballZ_file.py'
Dec 08 20:06:05 compute-0 sudo[166034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:05 compute-0 python3.9[166036]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:06:05 compute-0 sudo[166034]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:06 compute-0 sudo[166186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-loprnqlexcasrouhraxmjmxugagrpjji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224366.3504388-355-224008290340273/AnsiballZ_file.py'
Dec 08 20:06:06 compute-0 sudo[166186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:06 compute-0 python3.9[166188]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:06 compute-0 sudo[166186]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:07 compute-0 sudo[166338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkgckmrdbwjwlpeyvxqapttwrtqkquaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224367.0566366-363-93831800168488/AnsiballZ_stat.py'
Dec 08 20:06:07 compute-0 sudo[166338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:07 compute-0 python3.9[166340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:06:07 compute-0 sudo[166338]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:07 compute-0 sudo[166416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctfpcyzzedtbhcbrftnfsbzdhsfoltzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224367.0566366-363-93831800168488/AnsiballZ_file.py'
Dec 08 20:06:07 compute-0 sudo[166416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:08 compute-0 python3.9[166418]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:08 compute-0 sudo[166416]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:08 compute-0 sudo[166568]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znqpvvcyuygtugghqaphiivqlbpembau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224368.218778-375-7187915778663/AnsiballZ_stat.py'
Dec 08 20:06:08 compute-0 sudo[166568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:08 compute-0 python3.9[166570]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:06:08 compute-0 sudo[166568]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:08 compute-0 sudo[166646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcyfqogmyqthtzefnlqgncbscmwasloo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224368.218778-375-7187915778663/AnsiballZ_file.py'
Dec 08 20:06:08 compute-0 sudo[166646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:09 compute-0 python3.9[166648]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:09 compute-0 sudo[166646]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:09 compute-0 sudo[166798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mthxzhvistxtxoczuwvhedrmncxqbgbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224369.3425577-387-188191253430147/AnsiballZ_systemd.py'
Dec 08 20:06:09 compute-0 sudo[166798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:09 compute-0 python3.9[166800]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:06:09 compute-0 systemd[1]: Reloading.
Dec 08 20:06:10 compute-0 systemd-rc-local-generator[166824]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:06:10 compute-0 systemd-sysv-generator[166830]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:06:10 compute-0 sudo[166798]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:10 compute-0 sudo[166987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juipfesafdwxdpdeflirtggolvzmfakv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224370.4295006-395-45533511881058/AnsiballZ_stat.py'
Dec 08 20:06:10 compute-0 sudo[166987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:10 compute-0 python3.9[166989]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:06:11 compute-0 sudo[166987]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:11 compute-0 sudo[167065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jitkvfjphlxokxxpelbvbygldadrfxwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224370.4295006-395-45533511881058/AnsiballZ_file.py'
Dec 08 20:06:11 compute-0 sudo[167065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:11 compute-0 python3.9[167067]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:11 compute-0 sudo[167065]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:11 compute-0 sudo[167217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boykmcwjbbwkloopqqyjeuhejagmjmrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224371.6188357-407-195494240759962/AnsiballZ_stat.py'
Dec 08 20:06:11 compute-0 sudo[167217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:12 compute-0 python3.9[167219]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:06:12 compute-0 sudo[167217]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:12 compute-0 sudo[167295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnkymacrspqwxrexxqrnyurlefctzgxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224371.6188357-407-195494240759962/AnsiballZ_file.py'
Dec 08 20:06:12 compute-0 sudo[167295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:12 compute-0 python3.9[167297]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:12 compute-0 sudo[167295]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:13 compute-0 sudo[167447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpnxbjhtxyygpzajbntulgegntkguoua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224372.7345853-419-125338614580191/AnsiballZ_systemd.py'
Dec 08 20:06:13 compute-0 sudo[167447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:13 compute-0 python3.9[167449]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:06:13 compute-0 systemd[1]: Reloading.
Dec 08 20:06:13 compute-0 systemd-sysv-generator[167480]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:06:13 compute-0 systemd-rc-local-generator[167472]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:06:13 compute-0 systemd[1]: Starting Create netns directory...
Dec 08 20:06:13 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 08 20:06:13 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 08 20:06:13 compute-0 systemd[1]: Finished Create netns directory.
Dec 08 20:06:13 compute-0 sudo[167447]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:14 compute-0 sudo[167642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtgpegclnlgibnxpblioeioyoqsqgdmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224374.1177497-429-127566821840263/AnsiballZ_file.py'
Dec 08 20:06:14 compute-0 sudo[167642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:14 compute-0 python3.9[167644]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:06:14 compute-0 sudo[167642]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:15 compute-0 sudo[167794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpyriyeovpwshbnyjydjfkyzjtehzvyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224374.8026803-437-139957767501082/AnsiballZ_stat.py'
Dec 08 20:06:15 compute-0 sudo[167794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:15 compute-0 python3.9[167796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:06:15 compute-0 sudo[167794]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:15 compute-0 sudo[167917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvmqgpsnuvtfymuirvjvgivmazloelti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224374.8026803-437-139957767501082/AnsiballZ_copy.py'
Dec 08 20:06:15 compute-0 sudo[167917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:15 compute-0 python3.9[167919]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224374.8026803-437-139957767501082/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:06:15 compute-0 sudo[167917]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:16 compute-0 sudo[168069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jqsshpbbhbhoopckwkvzngpcshiassju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224376.3170884-454-223826290700463/AnsiballZ_file.py'
Dec 08 20:06:16 compute-0 sudo[168069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:16 compute-0 python3.9[168071]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:06:16 compute-0 sudo[168069]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:17 compute-0 sudo[168221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmmamhcpxvjprddjvvmtjijscbwlkgix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224377.0699537-462-36530500133275/AnsiballZ_stat.py'
Dec 08 20:06:17 compute-0 sudo[168221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:17 compute-0 python3.9[168223]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:06:17 compute-0 sudo[168221]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:18 compute-0 sudo[168344]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twisgjdgoramnzlyawanprfrnuluqxuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224377.0699537-462-36530500133275/AnsiballZ_copy.py'
Dec 08 20:06:18 compute-0 sudo[168344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:18 compute-0 python3.9[168346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224377.0699537-462-36530500133275/.source.json _original_basename=.7cwiz0h2 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:18 compute-0 sudo[168344]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:18 compute-0 sudo[168496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tiiwgqsaduvedrzpukcevcrffcrvjgml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224378.4539428-477-31182077582369/AnsiballZ_file.py'
Dec 08 20:06:18 compute-0 sudo[168496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:18 compute-0 python3.9[168498]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:18 compute-0 sudo[168496]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:19 compute-0 sudo[168648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lzgftonwqiecogzhcdhscskrkmynxsln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224379.190636-485-43051006981417/AnsiballZ_stat.py'
Dec 08 20:06:19 compute-0 sudo[168648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:19 compute-0 sudo[168648]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:20 compute-0 sudo[168771]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmgdzlbuzsldeteexucsefwxjtwxykyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224379.190636-485-43051006981417/AnsiballZ_copy.py'
Dec 08 20:06:20 compute-0 sudo[168771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:20 compute-0 sudo[168771]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:21 compute-0 sudo[168923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsceggjshkuuvhjxmhezbsrisvzoehgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224380.8247921-502-111900763125763/AnsiballZ_container_config_data.py'
Dec 08 20:06:21 compute-0 sudo[168923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:21 compute-0 python3.9[168925]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 08 20:06:21 compute-0 sudo[168923]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:22 compute-0 sudo[169075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilgquhtbcvmizzacxogtqlkjxmilxqjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224381.801868-511-59814124967183/AnsiballZ_container_config_hash.py'
Dec 08 20:06:22 compute-0 sudo[169075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:22 compute-0 python3.9[169077]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 08 20:06:22 compute-0 sudo[169075]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:23 compute-0 sudo[169227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uoxsiacrivibfpvfbmqjlykujzofaoae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224382.7719598-520-172256900209358/AnsiballZ_podman_container_info.py'
Dec 08 20:06:23 compute-0 sudo[169227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:23 compute-0 python3.9[169229]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 08 20:06:23 compute-0 sudo[169227]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:24 compute-0 sudo[169405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuxiunojhxdsnuthkhdixfxanatznqaj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224384.2433748-533-162401677680581/AnsiballZ_edpm_container_manage.py'
Dec 08 20:06:24 compute-0 sudo[169405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:25 compute-0 python3[169407]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 08 20:06:25 compute-0 podman[169444]: 2025-12-08 20:06:25.244719075 +0000 UTC m=+0.052904913 container create c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:06:25 compute-0 podman[169444]: 2025-12-08 20:06:25.218757914 +0000 UTC m=+0.026943762 image pull bcd3898ac099c7fff3d2ff3fc32de931119ed36068f8a2617bd8fa95e51d1b81 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 08 20:06:25 compute-0 python3[169407]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 08 20:06:25 compute-0 sudo[169405]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:25 compute-0 sudo[169632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkqhxrxjcrppvpneqzmagnlhimhaqwlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224385.6156495-541-32861192687427/AnsiballZ_stat.py'
Dec 08 20:06:25 compute-0 sudo[169632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:26 compute-0 python3.9[169634]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:06:26 compute-0 sudo[169632]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:26 compute-0 sudo[169786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zalihesgnbjuzuicozxikkpmvfgojtxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224386.6208029-550-80001476175152/AnsiballZ_file.py'
Dec 08 20:06:26 compute-0 sudo[169786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:27 compute-0 python3.9[169788]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:27 compute-0 sudo[169786]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:27 compute-0 podman[169836]: 2025-12-08 20:06:27.450787096 +0000 UTC m=+0.057306579 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Dec 08 20:06:27 compute-0 sudo[169881]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxnhlauyybppfgmbzrjvuciyydqbjbqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224386.6208029-550-80001476175152/AnsiballZ_stat.py'
Dec 08 20:06:27 compute-0 sudo[169881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:27 compute-0 python3.9[169884]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:06:27 compute-0 sudo[169881]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:28 compute-0 sudo[170033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwuhlvykdzsgvvwqplrsromruztygkqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224387.7721658-550-218050707926994/AnsiballZ_copy.py'
Dec 08 20:06:28 compute-0 sudo[170033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:28 compute-0 python3.9[170035]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765224387.7721658-550-218050707926994/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:28 compute-0 sudo[170033]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:28 compute-0 sudo[170109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grejhloniywtklgsxkbbzkugmxcatbjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224387.7721658-550-218050707926994/AnsiballZ_systemd.py'
Dec 08 20:06:28 compute-0 sudo[170109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:29 compute-0 python3.9[170111]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:06:29 compute-0 systemd[1]: Reloading.
Dec 08 20:06:29 compute-0 systemd-sysv-generator[170143]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:06:29 compute-0 systemd-rc-local-generator[170140]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:06:29 compute-0 sudo[170109]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:29 compute-0 sudo[170221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zghdchogszxdrvzclbmvebuiteoinrzz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224387.7721658-550-218050707926994/AnsiballZ_systemd.py'
Dec 08 20:06:29 compute-0 sudo[170221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:29 compute-0 python3.9[170223]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:06:30 compute-0 systemd[1]: Reloading.
Dec 08 20:06:30 compute-0 systemd-sysv-generator[170256]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:06:30 compute-0 systemd-rc-local-generator[170249]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:06:30 compute-0 systemd[1]: Starting multipathd container...
Dec 08 20:06:30 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:06:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e819bd622a50e57cc9f366dc89bf29b80f91dbe3fae226428ee2429eb321350/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 08 20:06:30 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e819bd622a50e57cc9f366dc89bf29b80f91dbe3fae226428ee2429eb321350/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 08 20:06:30 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c.
Dec 08 20:06:30 compute-0 podman[170263]: 2025-12-08 20:06:30.489363275 +0000 UTC m=+0.147921698 container init c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 08 20:06:30 compute-0 multipathd[170279]: + sudo -E kolla_set_configs
Dec 08 20:06:30 compute-0 podman[170263]: 2025-12-08 20:06:30.520564299 +0000 UTC m=+0.179122562 container start c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 08 20:06:30 compute-0 podman[170263]: multipathd
Dec 08 20:06:30 compute-0 sudo[170286]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 08 20:06:30 compute-0 sudo[170286]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 08 20:06:30 compute-0 sudo[170286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 08 20:06:30 compute-0 systemd[1]: Started multipathd container.
Dec 08 20:06:30 compute-0 multipathd[170279]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 08 20:06:30 compute-0 multipathd[170279]: INFO:__main__:Validating config file
Dec 08 20:06:30 compute-0 multipathd[170279]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 08 20:06:30 compute-0 multipathd[170279]: INFO:__main__:Writing out command to execute
Dec 08 20:06:30 compute-0 sudo[170221]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:30 compute-0 sudo[170286]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:30 compute-0 multipathd[170279]: ++ cat /run_command
Dec 08 20:06:30 compute-0 multipathd[170279]: + CMD='/usr/sbin/multipathd -d'
Dec 08 20:06:30 compute-0 multipathd[170279]: + ARGS=
Dec 08 20:06:30 compute-0 multipathd[170279]: + sudo kolla_copy_cacerts
Dec 08 20:06:30 compute-0 sudo[170300]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 08 20:06:30 compute-0 sudo[170300]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 08 20:06:30 compute-0 sudo[170300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 08 20:06:30 compute-0 sudo[170300]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:30 compute-0 multipathd[170279]: + [[ ! -n '' ]]
Dec 08 20:06:30 compute-0 multipathd[170279]: + . kolla_extend_start
Dec 08 20:06:30 compute-0 multipathd[170279]: Running command: '/usr/sbin/multipathd -d'
Dec 08 20:06:30 compute-0 multipathd[170279]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 08 20:06:30 compute-0 multipathd[170279]: + umask 0022
Dec 08 20:06:30 compute-0 multipathd[170279]: + exec /usr/sbin/multipathd -d
Dec 08 20:06:30 compute-0 podman[170285]: 2025-12-08 20:06:30.630992556 +0000 UTC m=+0.089447284 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:06:30 compute-0 multipathd[170279]: 2928.297485 | --------start up--------
Dec 08 20:06:30 compute-0 multipathd[170279]: 2928.297504 | read /etc/multipath.conf
Dec 08 20:06:30 compute-0 systemd[1]: c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c-19b7d69a4e0650a8.service: Main process exited, code=exited, status=1/FAILURE
Dec 08 20:06:30 compute-0 systemd[1]: c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c-19b7d69a4e0650a8.service: Failed with result 'exit-code'.
Dec 08 20:06:30 compute-0 multipathd[170279]: 2928.305135 | path checkers start up
Dec 08 20:06:31 compute-0 python3.9[170467]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:06:31 compute-0 sudo[170619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukzgtvmvayuufjeelerdaqvnbbdhkusl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224391.4755719-586-262929808079606/AnsiballZ_command.py'
Dec 08 20:06:31 compute-0 sudo[170619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:33 compute-0 podman[170622]: 2025-12-08 20:06:33.951884275 +0000 UTC m=+0.091644792 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Dec 08 20:06:33 compute-0 python3.9[170621]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:06:34 compute-0 sudo[170619]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:34 compute-0 sudo[170808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oekjuosetxdfrsdnysxlkdibtfjpvcxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224394.2406235-594-213901680485304/AnsiballZ_systemd.py'
Dec 08 20:06:34 compute-0 sudo[170808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:34 compute-0 python3.9[170810]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:06:34 compute-0 systemd[1]: Stopping multipathd container...
Dec 08 20:06:34 compute-0 multipathd[170279]: 2932.629677 | exit (signal)
Dec 08 20:06:34 compute-0 multipathd[170279]: 2932.630417 | --------shut down-------
Dec 08 20:06:34 compute-0 systemd[1]: libpod-c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c.scope: Deactivated successfully.
Dec 08 20:06:34 compute-0 podman[170814]: 2025-12-08 20:06:34.996957307 +0000 UTC m=+0.089792224 container died c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3)
Dec 08 20:06:35 compute-0 systemd[1]: c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c-19b7d69a4e0650a8.timer: Deactivated successfully.
Dec 08 20:06:35 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c.
Dec 08 20:06:35 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c-userdata-shm.mount: Deactivated successfully.
Dec 08 20:06:35 compute-0 systemd[1]: var-lib-containers-storage-overlay-9e819bd622a50e57cc9f366dc89bf29b80f91dbe3fae226428ee2429eb321350-merged.mount: Deactivated successfully.
Dec 08 20:06:35 compute-0 podman[170814]: 2025-12-08 20:06:35.061589104 +0000 UTC m=+0.154423991 container cleanup c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 08 20:06:35 compute-0 podman[170814]: multipathd
Dec 08 20:06:35 compute-0 podman[170841]: multipathd
Dec 08 20:06:35 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 08 20:06:35 compute-0 systemd[1]: Stopped multipathd container.
Dec 08 20:06:35 compute-0 systemd[1]: Starting multipathd container...
Dec 08 20:06:35 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:06:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e819bd622a50e57cc9f366dc89bf29b80f91dbe3fae226428ee2429eb321350/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 08 20:06:35 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e819bd622a50e57cc9f366dc89bf29b80f91dbe3fae226428ee2429eb321350/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 08 20:06:35 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c.
Dec 08 20:06:35 compute-0 podman[170853]: 2025-12-08 20:06:35.343576186 +0000 UTC m=+0.152755150 container init c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 08 20:06:35 compute-0 multipathd[170870]: + sudo -E kolla_set_configs
Dec 08 20:06:35 compute-0 podman[170853]: 2025-12-08 20:06:35.37607415 +0000 UTC m=+0.185253074 container start c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 08 20:06:35 compute-0 podman[170853]: multipathd
Dec 08 20:06:35 compute-0 systemd[1]: Started multipathd container.
Dec 08 20:06:35 compute-0 sudo[170876]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 08 20:06:35 compute-0 sudo[170876]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 08 20:06:35 compute-0 sudo[170876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 08 20:06:35 compute-0 sudo[170808]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:35 compute-0 multipathd[170870]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 08 20:06:35 compute-0 multipathd[170870]: INFO:__main__:Validating config file
Dec 08 20:06:35 compute-0 multipathd[170870]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 08 20:06:35 compute-0 multipathd[170870]: INFO:__main__:Writing out command to execute
Dec 08 20:06:35 compute-0 sudo[170876]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:35 compute-0 multipathd[170870]: ++ cat /run_command
Dec 08 20:06:35 compute-0 multipathd[170870]: + CMD='/usr/sbin/multipathd -d'
Dec 08 20:06:35 compute-0 multipathd[170870]: + ARGS=
Dec 08 20:06:35 compute-0 multipathd[170870]: + sudo kolla_copy_cacerts
Dec 08 20:06:35 compute-0 sudo[170906]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 08 20:06:35 compute-0 sudo[170906]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Dec 08 20:06:35 compute-0 sudo[170906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Dec 08 20:06:35 compute-0 podman[170877]: 2025-12-08 20:06:35.495828019 +0000 UTC m=+0.098299430 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 08 20:06:35 compute-0 sudo[170906]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:35 compute-0 multipathd[170870]: + [[ ! -n '' ]]
Dec 08 20:06:35 compute-0 multipathd[170870]: + . kolla_extend_start
Dec 08 20:06:35 compute-0 multipathd[170870]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 08 20:06:35 compute-0 multipathd[170870]: Running command: '/usr/sbin/multipathd -d'
Dec 08 20:06:35 compute-0 multipathd[170870]: + umask 0022
Dec 08 20:06:35 compute-0 multipathd[170870]: + exec /usr/sbin/multipathd -d
Dec 08 20:06:35 compute-0 systemd[1]: c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c-7492b4d9ca74fd4b.service: Main process exited, code=exited, status=1/FAILURE
Dec 08 20:06:35 compute-0 systemd[1]: c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c-7492b4d9ca74fd4b.service: Failed with result 'exit-code'.
Dec 08 20:06:35 compute-0 multipathd[170870]: 2933.176546 | --------start up--------
Dec 08 20:06:35 compute-0 multipathd[170870]: 2933.176561 | read /etc/multipath.conf
Dec 08 20:06:35 compute-0 multipathd[170870]: 2933.182287 | path checkers start up
Dec 08 20:06:35 compute-0 sudo[171058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kksshakjqferivaohwmrktchncnijyni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224395.6244757-602-85272022486720/AnsiballZ_file.py'
Dec 08 20:06:35 compute-0 sudo[171058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:36 compute-0 python3.9[171060]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:36 compute-0 sudo[171058]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:36 compute-0 sudo[171210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihybjmwjzitfxgublzhbztrrvrnydwlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224396.4984345-614-24194792430696/AnsiballZ_file.py'
Dec 08 20:06:36 compute-0 sudo[171210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:37 compute-0 python3.9[171212]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 08 20:06:37 compute-0 sudo[171210]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:37 compute-0 sudo[171362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlthgytilfwkbdawtaufguefcivycebo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224397.2630892-622-64255393932455/AnsiballZ_modprobe.py'
Dec 08 20:06:37 compute-0 sudo[171362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:37 compute-0 python3.9[171364]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 08 20:06:37 compute-0 kernel: Key type psk registered
Dec 08 20:06:37 compute-0 sudo[171362]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:38 compute-0 sudo[171523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihqvnkhtneoaziqluzvctuyghqrprmcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224398.1356406-630-10729425057734/AnsiballZ_stat.py'
Dec 08 20:06:38 compute-0 sudo[171523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:38 compute-0 python3.9[171525]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:06:38 compute-0 sudo[171523]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:39 compute-0 sudo[171646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hkpihtwwdazaczcbnwngocobsvwgrcqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224398.1356406-630-10729425057734/AnsiballZ_copy.py'
Dec 08 20:06:39 compute-0 sudo[171646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:39 compute-0 python3.9[171648]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224398.1356406-630-10729425057734/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:39 compute-0 sudo[171646]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:39 compute-0 sudo[171798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqthjphnisypnxdezgmcisarpbqcwxye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224399.5888681-646-226122586973286/AnsiballZ_lineinfile.py'
Dec 08 20:06:39 compute-0 sudo[171798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:40 compute-0 python3.9[171800]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:40 compute-0 sudo[171798]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:40 compute-0 sudo[171950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exgbxfxgfzsepbcgzmrayregvyxrcecr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224400.2968943-654-107860813780468/AnsiballZ_systemd.py'
Dec 08 20:06:40 compute-0 sudo[171950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:40 compute-0 python3.9[171952]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:06:40 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 08 20:06:40 compute-0 systemd[1]: Stopped Load Kernel Modules.
Dec 08 20:06:40 compute-0 systemd[1]: Stopping Load Kernel Modules...
Dec 08 20:06:40 compute-0 systemd[1]: Starting Load Kernel Modules...
Dec 08 20:06:40 compute-0 systemd[1]: Finished Load Kernel Modules.
Dec 08 20:06:40 compute-0 sudo[171950]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:41 compute-0 sshd-session[171954]: Received disconnect from 172.190.42.55 port 52348:11: Bye Bye [preauth]
Dec 08 20:06:41 compute-0 sshd-session[171954]: Disconnected from authenticating user root 172.190.42.55 port 52348 [preauth]
Dec 08 20:06:41 compute-0 sudo[172108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utcdfvycyelkgksrprofhmlsikpelpby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224401.359763-662-76544928462423/AnsiballZ_dnf.py'
Dec 08 20:06:41 compute-0 sudo[172108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:41 compute-0 python3.9[172110]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 08 20:06:43 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 08 20:06:44 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 08 20:06:45 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 08 20:06:45 compute-0 systemd[1]: Reloading.
Dec 08 20:06:45 compute-0 systemd-rc-local-generator[172153]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:06:45 compute-0 systemd-sysv-generator[172157]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:06:45 compute-0 systemd[1]: Reloading.
Dec 08 20:06:45 compute-0 systemd-sysv-generator[172187]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:06:45 compute-0 systemd-rc-local-generator[172184]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:06:46 compute-0 systemd-logind[793]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 08 20:06:46 compute-0 systemd-logind[793]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 08 20:06:46 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 08 20:06:46 compute-0 systemd[1]: Starting man-db-cache-update.service...
Dec 08 20:06:46 compute-0 systemd[1]: Reloading.
Dec 08 20:06:46 compute-0 systemd-rc-local-generator[172280]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:06:46 compute-0 systemd-sysv-generator[172285]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:06:46 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 08 20:06:46 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 08 20:06:47 compute-0 sudo[172108]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:47 compute-0 sudo[173583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moopjvmppjtdmiobgqrmhjpiisxzpqcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224407.3079221-670-164840563674521/AnsiballZ_systemd_service.py'
Dec 08 20:06:47 compute-0 sudo[173583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:47 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 08 20:06:47 compute-0 systemd[1]: Finished man-db-cache-update.service.
Dec 08 20:06:47 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.420s CPU time.
Dec 08 20:06:47 compute-0 systemd[1]: run-r0658b3caff2f48059ac218da266fb63b.service: Deactivated successfully.
Dec 08 20:06:47 compute-0 python3.9[173585]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:06:47 compute-0 systemd[1]: Stopping Open-iSCSI...
Dec 08 20:06:47 compute-0 iscsid[161934]: iscsid shutting down.
Dec 08 20:06:47 compute-0 systemd[1]: iscsid.service: Deactivated successfully.
Dec 08 20:06:47 compute-0 systemd[1]: Stopped Open-iSCSI.
Dec 08 20:06:47 compute-0 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 08 20:06:48 compute-0 systemd[1]: Starting Open-iSCSI...
Dec 08 20:06:48 compute-0 systemd[1]: Started Open-iSCSI.
Dec 08 20:06:48 compute-0 sudo[173583]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:49 compute-0 python3.9[173740]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 20:06:49 compute-0 sudo[173894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iszuqzbviywijtxcfphguqjtfgzfctdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224409.6475246-688-28120750437928/AnsiballZ_file.py'
Dec 08 20:06:49 compute-0 sudo[173894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:50 compute-0 python3.9[173896]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:06:50 compute-0 sudo[173894]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:50 compute-0 sudo[174046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykgsybxwakrgvvaddyhbxqdwjlnlmwnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224410.560697-699-127289385176572/AnsiballZ_systemd_service.py'
Dec 08 20:06:50 compute-0 sudo[174046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:51 compute-0 python3.9[174048]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:06:51 compute-0 systemd[1]: Reloading.
Dec 08 20:06:51 compute-0 systemd-sysv-generator[174074]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:06:51 compute-0 systemd-rc-local-generator[174069]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:06:51 compute-0 sudo[174046]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:52 compute-0 python3.9[174233]: ansible-ansible.builtin.service_facts Invoked
Dec 08 20:06:52 compute-0 network[174250]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 08 20:06:52 compute-0 network[174251]: 'network-scripts' will be removed from distribution in near future.
Dec 08 20:06:52 compute-0 network[174252]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 08 20:06:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:06:54.978 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:06:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:06:54.979 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:06:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:06:54.979 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:06:57 compute-0 sudo[174540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksztvxtwxbyzjupedgszehxfwltxnnxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224417.271884-718-137164784531840/AnsiballZ_systemd_service.py'
Dec 08 20:06:57 compute-0 sudo[174540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:57 compute-0 podman[174498]: 2025-12-08 20:06:57.605335669 +0000 UTC m=+0.085628473 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 08 20:06:57 compute-0 python3.9[174546]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:06:57 compute-0 sudo[174540]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:58 compute-0 sudo[174697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygzagnxtjyzqilwcsjmrlnbyvytxhwhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224418.0944116-718-129086006763326/AnsiballZ_systemd_service.py'
Dec 08 20:06:58 compute-0 sudo[174697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:58 compute-0 python3.9[174699]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:06:58 compute-0 sudo[174697]: pam_unix(sudo:session): session closed for user root
Dec 08 20:06:59 compute-0 sudo[174850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jistijjcrzovrlziknxjjtoxbpovxktt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224418.9267268-718-105916061618156/AnsiballZ_systemd_service.py'
Dec 08 20:06:59 compute-0 sudo[174850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:06:59 compute-0 python3.9[174852]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:06:59 compute-0 sudo[174850]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:00 compute-0 sudo[175003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkucehultsskomcyoesxnyyhgzdrvlrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224419.736871-718-21013986553652/AnsiballZ_systemd_service.py'
Dec 08 20:07:00 compute-0 sudo[175003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:00 compute-0 python3.9[175005]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:07:00 compute-0 sudo[175003]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:01 compute-0 sudo[175156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aasimwwxpunnbdyldpznpabxcejknwoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224420.7996094-718-130496507916468/AnsiballZ_systemd_service.py'
Dec 08 20:07:01 compute-0 sudo[175156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:01 compute-0 python3.9[175158]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:07:01 compute-0 sudo[175156]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:01 compute-0 sudo[175309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ueidagfjocurssiewbdphbameibgnpjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224421.6218529-718-269286763327753/AnsiballZ_systemd_service.py'
Dec 08 20:07:01 compute-0 sudo[175309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:02 compute-0 python3.9[175311]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:07:02 compute-0 sudo[175309]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:02 compute-0 sudo[175462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxnneflnuatxaxuxmgbybyicgnjryrov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224422.4355335-718-277397411995916/AnsiballZ_systemd_service.py'
Dec 08 20:07:02 compute-0 sudo[175462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:03 compute-0 python3.9[175464]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:07:03 compute-0 sudo[175462]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:03 compute-0 sudo[175615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtatfuvdpzraaqdutxfdjxjglkijstrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224423.2555676-718-24124937607406/AnsiballZ_systemd_service.py'
Dec 08 20:07:03 compute-0 sudo[175615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:03 compute-0 python3.9[175617]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:07:03 compute-0 sudo[175615]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:04 compute-0 podman[175695]: 2025-12-08 20:07:04.565644363 +0000 UTC m=+0.124011833 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 08 20:07:04 compute-0 sudo[175794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fruezlgoxyrnmlgskxxbuhmhpdfpjiyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224424.3269882-777-183249329940036/AnsiballZ_file.py'
Dec 08 20:07:04 compute-0 sudo[175794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:04 compute-0 python3.9[175796]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:05 compute-0 sudo[175794]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:05 compute-0 sudo[175946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qenerqysiauveikooduvaoqnghqdhuvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224425.1890624-777-170240904465072/AnsiballZ_file.py'
Dec 08 20:07:05 compute-0 sudo[175946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:05 compute-0 podman[175948]: 2025-12-08 20:07:05.601033432 +0000 UTC m=+0.059772507 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 08 20:07:05 compute-0 python3.9[175949]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:05 compute-0 sudo[175946]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:06 compute-0 sudo[176118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-auqmxgjiylxdikatwjrvlhflwbxeaira ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224425.9309325-777-224445895756884/AnsiballZ_file.py'
Dec 08 20:07:06 compute-0 sudo[176118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:06 compute-0 python3.9[176120]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:06 compute-0 sudo[176118]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:06 compute-0 sudo[176270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brbpfszcnwpuejnvcpimmgziwshwakpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224426.689036-777-196186996891665/AnsiballZ_file.py'
Dec 08 20:07:06 compute-0 sudo[176270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:07 compute-0 python3.9[176272]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:07 compute-0 sudo[176270]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:07 compute-0 sudo[176422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjmdjupokaswjzxkyumevgvkjckgmloh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224427.3677564-777-36694148273381/AnsiballZ_file.py'
Dec 08 20:07:07 compute-0 sudo[176422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:07 compute-0 python3.9[176424]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:07 compute-0 sudo[176422]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:08 compute-0 sudo[176574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssgxlkmvvqqbuydoodmpexntbzclfpql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224428.0492241-777-89866915561030/AnsiballZ_file.py'
Dec 08 20:07:08 compute-0 sudo[176574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:08 compute-0 python3.9[176576]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:08 compute-0 sudo[176574]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:09 compute-0 sudo[176726]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhsnvzmqeohbeclutytrhjlxawbpanav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224428.7830226-777-275643782286505/AnsiballZ_file.py'
Dec 08 20:07:09 compute-0 sudo[176726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:09 compute-0 python3.9[176728]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:09 compute-0 sudo[176726]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:09 compute-0 sudo[176879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttrgqybjdckbsymuxralmjkmrpzizvaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224429.5034425-777-68390943901351/AnsiballZ_file.py'
Dec 08 20:07:09 compute-0 sudo[176879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:09 compute-0 python3.9[176881]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:09 compute-0 sudo[176879]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:10 compute-0 sudo[177031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twjotohifhmrbphrvaiokkthhpekeqfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224430.2008414-834-86874104978165/AnsiballZ_file.py'
Dec 08 20:07:10 compute-0 sudo[177031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:10 compute-0 python3.9[177033]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:10 compute-0 sudo[177031]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:11 compute-0 sudo[177183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zloqckdxwarwaltnugqczrruvnitsors ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224430.885097-834-67439292569141/AnsiballZ_file.py'
Dec 08 20:07:11 compute-0 sudo[177183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:11 compute-0 python3.9[177185]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:11 compute-0 sudo[177183]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:11 compute-0 sudo[177335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpqvhoyllepyiskhcbfrhjclxocngyck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224431.5754151-834-159556319264646/AnsiballZ_file.py'
Dec 08 20:07:11 compute-0 sudo[177335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:12 compute-0 python3.9[177337]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:12 compute-0 sudo[177335]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:12 compute-0 sudo[177487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbukseeqywqcnbbwrtbxxofcmbcyjbrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224432.2191715-834-181437333101935/AnsiballZ_file.py'
Dec 08 20:07:12 compute-0 sudo[177487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:12 compute-0 python3.9[177489]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:12 compute-0 sudo[177487]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:13 compute-0 sudo[177639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsqyblvxderuknfmzzofqlottqemsgnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224432.9049702-834-78164645148108/AnsiballZ_file.py'
Dec 08 20:07:13 compute-0 sudo[177639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:13 compute-0 python3.9[177641]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:13 compute-0 sudo[177639]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:14 compute-0 sudo[177791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sffqhuyxcodjrmaazimwqdvspcxnmjtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224433.6940725-834-113323986481747/AnsiballZ_file.py'
Dec 08 20:07:14 compute-0 sudo[177791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:14 compute-0 python3.9[177793]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:14 compute-0 sudo[177791]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:14 compute-0 sudo[177943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnzcxprqkdxtvcpxvruivkmkwcdgtjvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224434.4299414-834-4403908275956/AnsiballZ_file.py'
Dec 08 20:07:14 compute-0 sudo[177943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:14 compute-0 python3.9[177945]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:14 compute-0 sudo[177943]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:15 compute-0 sshd-session[176755]: Connection closed by 45.78.228.32 port 57556 [preauth]
Dec 08 20:07:15 compute-0 sudo[178096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xymruoqmpktipededvievacdaqaenugw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224435.1035898-834-35899765702199/AnsiballZ_file.py'
Dec 08 20:07:15 compute-0 sudo[178096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:15 compute-0 python3.9[178098]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:15 compute-0 sudo[178096]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:16 compute-0 sudo[178248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apmqrziksibjfyixtsoirnjhztwjhzuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224435.929525-892-100796519128969/AnsiballZ_command.py'
Dec 08 20:07:16 compute-0 sudo[178248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:16 compute-0 python3.9[178250]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:07:16 compute-0 sudo[178248]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:17 compute-0 python3.9[178402]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 08 20:07:18 compute-0 sudo[178552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmqhsmukcfiymtkpvikbwaaskpxzcsky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224437.8035235-910-171532222911302/AnsiballZ_systemd_service.py'
Dec 08 20:07:18 compute-0 sudo[178552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:18 compute-0 python3.9[178554]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:07:18 compute-0 systemd[1]: Reloading.
Dec 08 20:07:18 compute-0 systemd-sysv-generator[178585]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:07:18 compute-0 systemd-rc-local-generator[178582]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:07:18 compute-0 sudo[178552]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:19 compute-0 sudo[178739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asrkjeygeukkixsiypmjedyayyzrbpcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224438.9441423-918-107743008255606/AnsiballZ_command.py'
Dec 08 20:07:19 compute-0 sudo[178739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:19 compute-0 python3.9[178741]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:07:19 compute-0 sudo[178739]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:20 compute-0 sudo[178892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-seylvqdksydfvskobadikvknohhqonpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224439.691888-918-189149734418039/AnsiballZ_command.py'
Dec 08 20:07:20 compute-0 sudo[178892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:20 compute-0 python3.9[178894]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:07:20 compute-0 sudo[178892]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:20 compute-0 sudo[179045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzplfpqrfioovjionyfjfgwayotkojkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224440.3648143-918-33127518564830/AnsiballZ_command.py'
Dec 08 20:07:20 compute-0 sudo[179045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:20 compute-0 python3.9[179047]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:07:20 compute-0 sudo[179045]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:21 compute-0 sudo[179198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pprwqzrubenytwptpikmuezyevgjjuce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224441.1548624-918-168407767870755/AnsiballZ_command.py'
Dec 08 20:07:21 compute-0 sudo[179198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:21 compute-0 python3.9[179200]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:07:21 compute-0 sudo[179198]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:22 compute-0 sudo[179351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifdjgeenkoorjmuszlhgrqezibsqbpbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224441.896651-918-149525768647974/AnsiballZ_command.py'
Dec 08 20:07:22 compute-0 sudo[179351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:22 compute-0 python3.9[179353]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:07:22 compute-0 sudo[179351]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:22 compute-0 sudo[179504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-taygfgwnadxemobhoavlcxcbaqdtzoyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224442.629896-918-245345460958663/AnsiballZ_command.py'
Dec 08 20:07:22 compute-0 sudo[179504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:23 compute-0 python3.9[179506]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:07:23 compute-0 sudo[179504]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:23 compute-0 sudo[179657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcfyxcqjxbteumoukzgugjlntfbxsgrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224443.236362-918-131814991415177/AnsiballZ_command.py'
Dec 08 20:07:23 compute-0 sudo[179657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:23 compute-0 python3.9[179659]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:07:23 compute-0 sudo[179657]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:24 compute-0 sudo[179810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahpbfhnonvvqmzyfmpaetjttmbzhhlzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224443.9122329-918-246925251309699/AnsiballZ_command.py'
Dec 08 20:07:24 compute-0 sudo[179810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:24 compute-0 python3.9[179812]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:07:24 compute-0 sudo[179810]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:25 compute-0 sudo[179963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-walrihavssdrezkebxbozurjsadwdccp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224445.618546-997-127140348573638/AnsiballZ_file.py'
Dec 08 20:07:25 compute-0 sudo[179963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:26 compute-0 python3.9[179965]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:26 compute-0 sudo[179963]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:26 compute-0 sudo[180115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aryzwkirvrlcghnnccsxgidgzbrtdizs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224446.302718-997-256069785357335/AnsiballZ_file.py'
Dec 08 20:07:26 compute-0 sudo[180115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:26 compute-0 python3.9[180117]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:26 compute-0 sudo[180115]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:27 compute-0 sudo[180267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfpzapszfxxwxwbmrunetvyfmkibjbbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224446.9993286-997-265567781550160/AnsiballZ_file.py'
Dec 08 20:07:27 compute-0 sudo[180267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:27 compute-0 python3.9[180269]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:27 compute-0 sudo[180267]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:28 compute-0 podman[180393]: 2025-12-08 20:07:28.004447781 +0000 UTC m=+0.059629064 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:07:28 compute-0 sudo[180434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rryzbclzlirmxmcvpwwcjwznmqruzwkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224447.7136345-1019-9742729870756/AnsiballZ_file.py'
Dec 08 20:07:28 compute-0 sudo[180434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:28 compute-0 python3.9[180440]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:28 compute-0 sudo[180434]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:28 compute-0 sudo[180590]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sksptduyxkgsfqnfyshcucvqraimmrhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224448.6136281-1019-117751156372167/AnsiballZ_file.py'
Dec 08 20:07:28 compute-0 sudo[180590]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:29 compute-0 python3.9[180592]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:29 compute-0 sudo[180590]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:29 compute-0 sudo[180742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzfjnubypookskjpaurlmvzmpyxuaaxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224449.3005993-1019-148978812942989/AnsiballZ_file.py'
Dec 08 20:07:29 compute-0 sudo[180742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:29 compute-0 python3.9[180744]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:29 compute-0 sudo[180742]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:30 compute-0 sudo[180894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svmirvxjehavtcyrbfhzxiaxrgrguiie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224449.9734223-1019-273307392707010/AnsiballZ_file.py'
Dec 08 20:07:30 compute-0 sudo[180894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:30 compute-0 python3.9[180896]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:30 compute-0 sudo[180894]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:30 compute-0 sudo[181046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfhoudsgeiehbyhuamogrhpejeschcrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224450.6228962-1019-141771655744627/AnsiballZ_file.py'
Dec 08 20:07:30 compute-0 sudo[181046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:31 compute-0 python3.9[181048]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:31 compute-0 sudo[181046]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:31 compute-0 sudo[181198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjgtwgycotzguxjclixuwwgynytodkcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224451.2770166-1019-184329704336664/AnsiballZ_file.py'
Dec 08 20:07:31 compute-0 sudo[181198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:31 compute-0 python3.9[181200]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:31 compute-0 sudo[181198]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:32 compute-0 sudo[181350]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxntlavbeifmnvfnlptoblngfrfcqluo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224452.218905-1019-20789153292743/AnsiballZ_file.py'
Dec 08 20:07:32 compute-0 sudo[181350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:32 compute-0 python3.9[181352]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:32 compute-0 sudo[181350]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:35 compute-0 podman[181377]: 2025-12-08 20:07:35.557488014 +0000 UTC m=+0.131400605 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 08 20:07:36 compute-0 podman[181403]: 2025-12-08 20:07:36.527440829 +0000 UTC m=+0.084802807 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 08 20:07:38 compute-0 sudo[181549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zemgxacsanwdmzraftmfgmglslwhgmem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224457.7136757-1188-106050674420876/AnsiballZ_getent.py'
Dec 08 20:07:38 compute-0 sudo[181549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:38 compute-0 python3.9[181551]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 08 20:07:38 compute-0 sudo[181549]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:38 compute-0 sudo[181702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjtvpcgpjpxxahlcniczsprvjqpszixm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224458.5714624-1196-59564940740511/AnsiballZ_group.py'
Dec 08 20:07:39 compute-0 sudo[181702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:39 compute-0 python3.9[181704]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 08 20:07:39 compute-0 groupadd[181705]: group added to /etc/group: name=nova, GID=42436
Dec 08 20:07:39 compute-0 groupadd[181705]: group added to /etc/gshadow: name=nova
Dec 08 20:07:39 compute-0 groupadd[181705]: new group: name=nova, GID=42436
Dec 08 20:07:39 compute-0 sudo[181702]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:40 compute-0 sudo[181860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnkcwqywowmixjnxswsnskcudgyiwjay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224459.4479828-1204-280136272676613/AnsiballZ_user.py'
Dec 08 20:07:40 compute-0 sudo[181860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:40 compute-0 python3.9[181862]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 08 20:07:40 compute-0 useradd[181864]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Dec 08 20:07:40 compute-0 useradd[181864]: add 'nova' to group 'libvirt'
Dec 08 20:07:40 compute-0 useradd[181864]: add 'nova' to shadow group 'libvirt'
Dec 08 20:07:40 compute-0 sudo[181860]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:41 compute-0 sshd-session[181895]: Accepted publickey for zuul from 192.168.122.30 port 33728 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 20:07:41 compute-0 systemd-logind[793]: New session 24 of user zuul.
Dec 08 20:07:41 compute-0 systemd[1]: Started Session 24 of User zuul.
Dec 08 20:07:41 compute-0 sshd-session[181895]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 20:07:41 compute-0 sshd-session[181898]: Received disconnect from 192.168.122.30 port 33728:11: disconnected by user
Dec 08 20:07:41 compute-0 sshd-session[181898]: Disconnected from user zuul 192.168.122.30 port 33728
Dec 08 20:07:41 compute-0 sshd-session[181895]: pam_unix(sshd:session): session closed for user zuul
Dec 08 20:07:41 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Dec 08 20:07:41 compute-0 systemd-logind[793]: Session 24 logged out. Waiting for processes to exit.
Dec 08 20:07:41 compute-0 systemd-logind[793]: Removed session 24.
Dec 08 20:07:42 compute-0 python3.9[182048]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:07:42 compute-0 python3.9[182169]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224461.7403152-1229-220526918283631/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:43 compute-0 python3.9[182319]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:07:43 compute-0 python3.9[182395]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:44 compute-0 python3.9[182545]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:07:44 compute-0 python3.9[182666]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224463.9327202-1229-162643749944436/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:45 compute-0 python3.9[182816]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:07:46 compute-0 python3.9[182937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224465.0829988-1229-191554163264140/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:46 compute-0 python3.9[183087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:07:47 compute-0 python3.9[183208]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224466.391225-1229-20327051556927/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:47 compute-0 python3.9[183358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:07:48 compute-0 python3.9[183479]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224467.4870777-1229-96600950235656/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:48 compute-0 sudo[183629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejjbwauwedyqpzicmisvwcryzutscblf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224468.679231-1312-114634749369383/AnsiballZ_file.py'
Dec 08 20:07:48 compute-0 sudo[183629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:49 compute-0 python3.9[183631]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:49 compute-0 sudo[183629]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:49 compute-0 sudo[183781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqifflelaenmhotocgtplxzgscahrfmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224469.3502154-1320-129225092732925/AnsiballZ_copy.py'
Dec 08 20:07:49 compute-0 sudo[183781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:50 compute-0 python3.9[183783]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:07:50 compute-0 sudo[183781]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:50 compute-0 sudo[183933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ythnrolrjbaemqckgcyhmifadjpdplhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224470.2904515-1328-177839889968070/AnsiballZ_stat.py'
Dec 08 20:07:50 compute-0 sudo[183933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:50 compute-0 python3.9[183935]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:07:50 compute-0 sudo[183933]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:51 compute-0 sudo[184085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfubggolaodgunbetadamzehmcfmjgjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224471.0708096-1336-2587815837121/AnsiballZ_stat.py'
Dec 08 20:07:51 compute-0 sudo[184085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:51 compute-0 python3.9[184087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:07:51 compute-0 sudo[184085]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:51 compute-0 sudo[184208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhnwpjwjyljfdlzdquqbpkcawijugrva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224471.0708096-1336-2587815837121/AnsiballZ_copy.py'
Dec 08 20:07:51 compute-0 sudo[184208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:52 compute-0 python3.9[184210]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1765224471.0708096-1336-2587815837121/.source _original_basename=.5iq_nl70 follow=False checksum=a6c0e42a1335639ecdb001d3bb29945c47e368c0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 08 20:07:52 compute-0 sudo[184208]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:52 compute-0 python3.9[184362]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:07:53 compute-0 python3.9[184514]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:07:54 compute-0 python3.9[184635]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224473.2052753-1362-220322702098953/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:07:54.979 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:07:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:07:54.981 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:07:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:07:54.981 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:07:55 compute-0 python3.9[184785]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:07:55 compute-0 python3.9[184906]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224474.6535337-1377-4061321988068/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:07:56 compute-0 sudo[185056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eaueftrvgedrbnbxtwvzbhspvciusgkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224475.9824805-1394-38167168644687/AnsiballZ_container_config_data.py'
Dec 08 20:07:56 compute-0 sudo[185056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:56 compute-0 python3.9[185058]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 08 20:07:56 compute-0 sudo[185056]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:57 compute-0 sudo[185208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsrknglbwscktnfaqophzdhatelqqqwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224476.7683759-1403-62688754939530/AnsiballZ_container_config_hash.py'
Dec 08 20:07:57 compute-0 sudo[185208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:57 compute-0 python3.9[185210]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 08 20:07:57 compute-0 sudo[185208]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:57 compute-0 sudo[185360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-optzgfflzddinwdwqfbeghnwbagjnpob ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224477.7268867-1413-110561631485492/AnsiballZ_edpm_container_manage.py'
Dec 08 20:07:57 compute-0 sudo[185360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:58 compute-0 python3[185362]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 08 20:07:58 compute-0 podman[185398]: 2025-12-08 20:07:58.464055611 +0000 UTC m=+0.058058975 container create 92846656fc95b24fdc85415530a3efb12232aff121aa2c87562cae5dcb51e43f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 08 20:07:58 compute-0 podman[185398]: 2025-12-08 20:07:58.426869226 +0000 UTC m=+0.020872590 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 08 20:07:58 compute-0 python3[185362]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 08 20:07:58 compute-0 podman[185399]: 2025-12-08 20:07:58.494207209 +0000 UTC m=+0.061658208 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 08 20:07:58 compute-0 sudo[185360]: pam_unix(sudo:session): session closed for user root
Dec 08 20:07:59 compute-0 sudo[185607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpwruueexnexaglvbkejzynxqmktcfnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224478.8061674-1421-185706458902585/AnsiballZ_stat.py'
Dec 08 20:07:59 compute-0 sudo[185607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:07:59 compute-0 python3.9[185609]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:07:59 compute-0 sudo[185607]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:00 compute-0 sudo[185761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lisnmaxxhyuhwuheypianynixroizwgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224479.8322732-1433-89089883267479/AnsiballZ_container_config_data.py'
Dec 08 20:08:00 compute-0 sudo[185761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:00 compute-0 python3.9[185763]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 08 20:08:00 compute-0 sudo[185761]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:00 compute-0 sshd-session[185808]: Received disconnect from 172.190.42.55 port 58470:11: Bye Bye [preauth]
Dec 08 20:08:00 compute-0 sshd-session[185808]: Disconnected from authenticating user root 172.190.42.55 port 58470 [preauth]
Dec 08 20:08:00 compute-0 sudo[185915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elyevtizrihsgocwbqavgqyioejccooq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224480.5761626-1442-76871017097612/AnsiballZ_container_config_hash.py'
Dec 08 20:08:00 compute-0 sudo[185915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:01 compute-0 python3.9[185917]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 08 20:08:01 compute-0 sudo[185915]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:01 compute-0 sudo[186067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnjfakafmzmqpckdjghhgxfavjkyadbm ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224481.6589043-1452-265740218299943/AnsiballZ_edpm_container_manage.py'
Dec 08 20:08:01 compute-0 sudo[186067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:02 compute-0 python3[186069]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 08 20:08:02 compute-0 podman[186105]: 2025-12-08 20:08:02.383416132 +0000 UTC m=+0.028184177 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 08 20:08:02 compute-0 podman[186105]: 2025-12-08 20:08:02.535350264 +0000 UTC m=+0.180118289 container create 549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 08 20:08:02 compute-0 python3[186069]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 08 20:08:02 compute-0 sudo[186067]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:03 compute-0 sudo[186292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frxhrtbtpdkkhonhdbzdfwudamadxhtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224482.8917606-1460-259240142738597/AnsiballZ_stat.py'
Dec 08 20:08:03 compute-0 sudo[186292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:03 compute-0 python3.9[186294]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:08:03 compute-0 sudo[186292]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:03 compute-0 sudo[186446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvdonfxbjbeskmvuaxqrtatyfxrdzmsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224483.6464086-1469-219911042456292/AnsiballZ_file.py'
Dec 08 20:08:03 compute-0 sudo[186446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:04 compute-0 python3.9[186448]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:04 compute-0 sudo[186446]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:04 compute-0 sudo[186597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iohuadgovjkakbfidrerakmailmzppmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224484.1720777-1469-81694393327280/AnsiballZ_copy.py'
Dec 08 20:08:04 compute-0 sudo[186597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:04 compute-0 python3.9[186599]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765224484.1720777-1469-81694393327280/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:04 compute-0 sudo[186597]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:05 compute-0 sudo[186673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akpnwelpoybsnwvpzamtbyxlfwjpjdpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224484.1720777-1469-81694393327280/AnsiballZ_systemd.py'
Dec 08 20:08:05 compute-0 sudo[186673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:05 compute-0 python3.9[186675]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:08:05 compute-0 systemd[1]: Reloading.
Dec 08 20:08:05 compute-0 systemd-rc-local-generator[186704]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:08:05 compute-0 systemd-sysv-generator[186708]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:08:05 compute-0 sudo[186673]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:05 compute-0 podman[186712]: 2025-12-08 20:08:05.821223447 +0000 UTC m=+0.109830545 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 08 20:08:06 compute-0 sudo[186812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxcgejmuppuxyjoktqapsmthzcunbtdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224484.1720777-1469-81694393327280/AnsiballZ_systemd.py'
Dec 08 20:08:06 compute-0 sudo[186812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:06 compute-0 python3.9[186814]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:08:06 compute-0 systemd[1]: Reloading.
Dec 08 20:08:06 compute-0 systemd-rc-local-generator[186845]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:08:06 compute-0 systemd-sysv-generator[186848]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:08:06 compute-0 systemd[1]: Starting nova_compute container...
Dec 08 20:08:06 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:08:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:06 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:06 compute-0 podman[186854]: 2025-12-08 20:08:06.760764197 +0000 UTC m=+0.094393615 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:08:06 compute-0 podman[186856]: 2025-12-08 20:08:06.773339857 +0000 UTC m=+0.097054717 container init 549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 08 20:08:06 compute-0 podman[186856]: 2025-12-08 20:08:06.783654638 +0000 UTC m=+0.107369448 container start 549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute)
Dec 08 20:08:06 compute-0 podman[186856]: nova_compute
Dec 08 20:08:06 compute-0 nova_compute[186889]: + sudo -E kolla_set_configs
Dec 08 20:08:06 compute-0 systemd[1]: Started nova_compute container.
Dec 08 20:08:06 compute-0 sudo[186812]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Validating config file
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Copying service configuration files
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Deleting /etc/ceph
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Creating directory /etc/ceph
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /etc/ceph
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Writing out command to execute
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 08 20:08:06 compute-0 nova_compute[186889]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 08 20:08:06 compute-0 nova_compute[186889]: ++ cat /run_command
Dec 08 20:08:06 compute-0 nova_compute[186889]: + CMD=nova-compute
Dec 08 20:08:06 compute-0 nova_compute[186889]: + ARGS=
Dec 08 20:08:06 compute-0 nova_compute[186889]: + sudo kolla_copy_cacerts
Dec 08 20:08:06 compute-0 nova_compute[186889]: + [[ ! -n '' ]]
Dec 08 20:08:06 compute-0 nova_compute[186889]: + . kolla_extend_start
Dec 08 20:08:06 compute-0 nova_compute[186889]: Running command: 'nova-compute'
Dec 08 20:08:06 compute-0 nova_compute[186889]: + echo 'Running command: '\''nova-compute'\'''
Dec 08 20:08:06 compute-0 nova_compute[186889]: + umask 0022
Dec 08 20:08:06 compute-0 nova_compute[186889]: + exec nova-compute
Dec 08 20:08:07 compute-0 python3.9[187050]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:08:08 compute-0 python3.9[187201]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:08:08 compute-0 nova_compute[186889]: 2025-12-08 20:08:08.963 186893 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 08 20:08:08 compute-0 nova_compute[186889]: 2025-12-08 20:08:08.963 186893 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 08 20:08:08 compute-0 nova_compute[186889]: 2025-12-08 20:08:08.964 186893 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 08 20:08:08 compute-0 nova_compute[186889]: 2025-12-08 20:08:08.964 186893 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Dec 08 20:08:09 compute-0 nova_compute[186889]: 2025-12-08 20:08:09.148 186893 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:08:09 compute-0 nova_compute[186889]: 2025-12-08 20:08:09.164 186893 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.017s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:08:09 compute-0 nova_compute[186889]: 2025-12-08 20:08:09.165 186893 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Dec 08 20:08:09 compute-0 python3.9[187353]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:08:10 compute-0 sudo[187505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqqkvapdgphlultfnatrpmqrsgrshupk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224489.5240498-1529-79329496907659/AnsiballZ_podman_container.py'
Dec 08 20:08:10 compute-0 sudo[187505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:10 compute-0 python3.9[187507]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 08 20:08:10 compute-0 sudo[187505]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:10 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 08 20:08:10 compute-0 nova_compute[186889]: 2025-12-08 20:08:10.830 186893 INFO nova.virt.driver [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 08 20:08:10 compute-0 nova_compute[186889]: 2025-12-08 20:08:10.963 186893 INFO nova.compute.provider_config [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.100 186893 DEBUG oslo_concurrency.lockutils [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.101 186893 DEBUG oslo_concurrency.lockutils [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.102 186893 DEBUG oslo_concurrency.lockutils [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.102 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.103 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.103 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.103 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.103 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.104 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.104 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.104 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.105 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.105 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.105 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.106 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.106 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.106 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.107 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.107 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.107 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.108 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.108 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.108 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.109 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.109 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.109 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.110 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.110 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.110 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.111 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 sudo[187680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mayxakvjtdoognpuqcsiexjtzvsodvby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224490.795136-1537-181985547365070/AnsiballZ_systemd.py'
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.111 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.111 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.112 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 sudo[187680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.113 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.114 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.114 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.115 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.115 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.115 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.116 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.116 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.116 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.117 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.117 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.117 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.118 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.118 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.118 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.119 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.119 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.119 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.119 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.120 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.120 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.120 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.121 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.121 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.121 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.122 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.122 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.122 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.123 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.123 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.123 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.124 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.124 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.124 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.124 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.125 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.125 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.125 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.126 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.126 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.126 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.127 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.127 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.127 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.128 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.128 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.128 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.129 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.129 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.129 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.130 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.130 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.130 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.131 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.131 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.131 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.132 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.132 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.132 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.133 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.133 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.133 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.133 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.134 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.134 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.134 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.135 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.135 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.135 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.136 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.136 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.136 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.137 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.137 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.137 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.138 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.138 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.138 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.139 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.139 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.139 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.139 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.140 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.140 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.140 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.141 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.141 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.141 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.142 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.142 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.142 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.142 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.142 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.143 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.143 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.143 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.143 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.143 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.144 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.144 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.144 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.144 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.144 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.145 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.145 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.145 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.145 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.146 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.146 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.146 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.146 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.146 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.147 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.147 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.147 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.147 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.147 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.148 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
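The unprefixed options logged up to this point are nova's [DEFAULT]-group settings as resolved by oslo.config; the entries that follow (oslo_concurrency.*, oslo_messaging_metrics.*, api.*, cache.*, cinder.*) are the per-group options. A minimal sketch, not part of the log, of cross-checking a few of the logged values against the config files named in the "config files:" line above with Python's stdlib configparser; the option selection in KEYS is hypothetical, and treating every name as a [DEFAULT] key in those files is an assumption, since some values may come only from built-in defaults or /etc/nova/nova.conf.d:

    import configparser

    # Files listed in the "config files:" line of the dump above.
    CONFIG_FILES = ["/etc/nova/nova.conf", "/etc/nova/nova-compute.conf"]

    # Hypothetical selection of [DEFAULT] options whose logged values are of interest
    # (debug, force_config_drive, host, state_path, compute_driver).
    KEYS = ["debug", "force_config_drive", "host", "state_path", "compute_driver"]

    # interpolation=None keeps values such as "%(asctime)s" format strings intact.
    cp = configparser.ConfigParser(interpolation=None)
    parsed = cp.read(CONFIG_FILES)  # returns only the files it could actually open
    print("parsed files:", parsed)

    for key in KEYS:
        # fallback=None distinguishes "set in a file" from "built-in default only".
        value = cp.get("DEFAULT", key, fallback=None)
        print(f"{key:30} = {value}")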
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.148 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.148 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.148 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.148 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.149 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.149 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.149 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.149 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.150 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.150 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.150 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.150 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.151 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.151 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.151 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.151 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.151 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.152 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.152 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.152 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.152 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.152 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.153 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.153 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.153 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.153 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.154 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.154 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.154 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.154 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.154 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.155 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.155 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.155 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.155 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.156 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.156 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.156 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.156 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.156 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.157 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.157 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.157 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.158 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.158 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.158 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.158 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.158 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.159 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.159 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.159 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.159 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.160 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.160 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.160 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.160 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.161 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.161 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.161 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.161 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.162 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.162 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.162 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.162 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.162 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.163 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.163 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.163 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.163 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.163 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.164 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.164 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.164 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.164 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.164 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.165 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.165 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.165 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.165 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.165 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.166 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.166 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.166 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.166 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.166 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.167 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.167 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.167 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.167 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.168 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.168 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.168 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.168 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.168 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.169 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.169 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.169 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.169 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.170 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.170 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.170 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.170 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.170 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.171 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.171 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.171 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.171 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.171 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.172 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.172 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.172 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.172 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.173 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.173 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.173 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.173 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.173 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.174 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.174 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.174 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.174 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.174 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.175 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.175 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.175 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.175 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.176 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.176 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.176 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.176 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.177 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.177 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.177 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.177 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.178 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.178 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.178 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.178 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.178 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.179 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.179 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.179 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.179 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.179 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.179 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.179 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.179 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.180 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.180 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.180 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.180 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.180 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.180 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.180 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.181 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.181 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.181 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.181 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.181 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.182 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.182 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.182 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.182 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.182 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.182 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.183 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.183 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.183 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.183 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.183 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.183 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.183 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.184 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.184 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.184 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.184 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.184 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.185 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.185 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.185 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.185 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.185 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.185 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.185 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.186 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.186 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.186 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.186 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.186 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.186 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.186 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.187 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.187 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.187 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.187 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.187 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.187 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.187 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.188 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.188 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.188 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.188 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.188 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.188 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.188 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.189 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.189 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.189 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.189 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.189 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.190 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.190 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.190 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.190 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.191 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.191 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.191 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.191 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.191 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.191 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.191 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.192 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.192 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.192 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.192 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.192 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.192 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.192 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.192 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.193 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.193 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.193 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.193 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.194 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.194 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.194 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.194 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.194 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.194 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.194 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.195 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.195 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.195 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.195 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.195 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.195 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.196 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.196 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.196 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.196 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.196 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.196 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.196 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.197 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.197 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.197 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.197 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.197 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.197 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.197 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.198 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.198 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.198 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.198 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.198 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.198 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.198 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.199 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.199 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.199 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.199 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.199 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.199 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.199 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.200 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.200 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.200 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.200 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.200 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.200 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.201 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.201 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.201 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.201 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.201 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.201 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.202 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.202 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.202 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.202 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.202 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.202 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.203 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.203 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.203 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.203 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.203 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.203 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.203 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.204 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.204 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.204 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.204 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.204 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.204 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.205 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.205 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.205 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.205 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.205 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.205 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.206 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.206 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.206 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.206 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.206 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.206 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.206 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.207 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.207 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.207 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.207 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.207 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.207 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.207 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.208 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.208 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.208 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.208 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.208 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.208 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.209 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.209 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.209 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.209 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.210 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.210 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.210 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.210 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.210 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.210 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.211 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.211 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.211 186893 WARNING oslo_config.cfg [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 08 20:08:11 compute-0 nova_compute[186889]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 08 20:08:11 compute-0 nova_compute[186889]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 08 20:08:11 compute-0 nova_compute[186889]: and ``live_migration_inbound_addr`` respectively.
Dec 08 20:08:11 compute-0 nova_compute[186889]: ).  Its value may be silently ignored in the future.
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.211 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.212 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.212 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.212 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.212 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.212 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.213 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.213 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.213 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.213 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.213 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.214 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.214 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.214 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.214 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.214 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.214 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.215 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.215 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.215 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.215 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.215 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.215 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.216 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.216 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.216 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.216 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.216 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.216 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.216 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.217 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.217 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.217 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.217 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.217 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.218 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.218 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.218 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.218 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.218 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.218 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.218 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.219 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.219 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.219 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.219 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.219 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.220 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.220 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.220 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.220 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.220 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.220 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.220 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.221 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.221 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.221 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.221 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.221 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.221 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.222 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.222 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.222 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.222 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.222 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.222 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.222 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.223 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.223 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.223 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.223 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.223 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.223 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.223 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.224 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.224 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.224 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.224 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.224 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.224 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.225 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.225 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.225 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.225 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.225 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.225 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.226 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.226 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.226 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.226 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.226 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.226 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.226 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.227 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.227 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.227 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.227 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.227 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.227 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.228 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.228 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.228 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.228 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.228 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.228 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.228 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.228 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.229 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.229 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.229 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.229 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.229 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.230 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.230 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.230 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.230 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.230 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.230 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.230 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.231 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.231 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.231 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.231 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.231 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.231 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.231 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.232 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.232 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.232 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.232 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.232 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.232 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.233 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.233 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.233 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.233 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.233 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.233 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
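
The quota.* lines above all match the stock upstream defaults (cores 20, instances 10, ram 51200 MB, DbQuotaDriver, recheck_quota on), i.e. the [quota] group is effectively unconfigured on this host. As a minimal sketch of how oslo.config registers and resolves such grouped options, the snippet below re-registers a few of them and reads a hypothetical local ./nova.conf; the option list and file path are illustrative, not Nova's real definitions.

# Minimal oslo.config sketch (assumption: a local ./nova.conf with a [quota] section exists).
from oslo_config import cfg

CONF = cfg.ConfigOpts()
quota_opts = [
    cfg.IntOpt('cores', default=20, help='VCPU quota per project'),
    cfg.IntOpt('instances', default=10, help='Instance quota per project'),
    cfg.IntOpt('ram', default=51200, help='RAM quota in MB per project'),
]
CONF.register_opts(quota_opts, group='quota')

# Parse the file; any option missing from it keeps the default registered above,
# which is exactly what the dump lines above are showing for this deployment.
CONF(['--config-file', './nova.conf'], project='nova')
print(CONF.quota.cores, CONF.quota.instances, CONF.quota.ram)
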
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.234 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.234 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.234 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.234 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.234 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.234 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.235 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.235 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.235 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.235 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.235 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.235 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.235 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.236 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.236 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.236 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.236 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.236 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.237 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.237 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.237 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.237 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.237 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.237 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.237 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.238 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.238 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.238 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.238 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.238 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.238 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.238 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.239 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.239 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.239 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.239 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.239 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
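
The filter_scheduler.* block shows the enabled filter chain and the weigher multipliers: ram/cpu/disk/pci at 1.0, io_ops at -1.0, build_failure and cross_cell_move at 1000000.0. In the filter scheduler each weigher's raw values are normalized across the candidate hosts and then scaled by its multiplier, and the per-host sums decide placement order. The toy calculation below only illustrates that combination with made-up normalized scores; it is not Nova's weigher code.

# Toy illustration of how weigher multipliers combine (simplified; Nova normalizes
# each weigher's raw values across hosts before applying the multiplier).
ram_weight_multiplier = 1.0
cpu_weight_multiplier = 1.0
io_ops_weight_multiplier = -1.0   # negative: prefer hosts with fewer in-flight I/O operations

# Hypothetical normalized scores (0.0 .. 1.0 per weigher) for two hosts.
hosts = {
    'compute-0': {'ram': 0.8, 'cpu': 0.6, 'io_ops': 1.0},
    'compute-1': {'ram': 0.5, 'cpu': 0.9, 'io_ops': 0.2},
}

def total_weight(scores):
    return (ram_weight_multiplier * scores['ram']
            + cpu_weight_multiplier * scores['cpu']
            + io_ops_weight_multiplier * scores['io_ops'])

for name, scores in sorted(hosts.items(), key=lambda kv: total_weight(kv[1]), reverse=True):
    print(name, round(total_weight(scores), 2))
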
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.239 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.239 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.240 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.240 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.240 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.240 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.241 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.241 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.241 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.241 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.241 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.241 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.242 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.242 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.242 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.242 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.242 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.242 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.243 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.243 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.243 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.243 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.243 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.244 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.244 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.244 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.244 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.244 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.244 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.245 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.245 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.245 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.245 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.245 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.245 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.246 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.246 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.246 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.246 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.246 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.246 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.246 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.247 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.247 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.247 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.247 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.247 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.247 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.247 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.248 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.248 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.248 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.248 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.248 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.248 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.248 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.249 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.249 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.249 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.249 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.249 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.249 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.249 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.250 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.250 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.250 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.250 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.250 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.250 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.250 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.251 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.251 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.251 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.251 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.251 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.252 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.252 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.252 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.252 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.252 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.252 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
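
Unlike most groups in this dump, vnc.* carries deployment-specific values: consoles are enabled, the noVNC proxy URL points at the apps-crc.testing route, server_listen is ::0 and server_proxyclient_address is 192.168.122.100. In file form this corresponds to a [vnc] section of nova.conf; the sketch below rebuilds that section from the values above and reads it back with configparser, purely to show the mapping between the dump lines and the file format (the file content is reconstructed here, not copied from the host).

# Rebuild the [vnc] section implied by the dump above and read it back.
import configparser

vnc_section = """\
[vnc]
enabled = true
novncproxy_base_url = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html
server_listen = ::0
server_proxyclient_address = 192.168.122.100
"""

cp = configparser.ConfigParser()
cp.read_string(vnc_section)
print(cp.getboolean('vnc', 'enabled'))
print(cp.get('vnc', 'server_proxyclient_address'))
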
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.252 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.253 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.253 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.253 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.253 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.253 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.254 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.254 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.254 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.254 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.254 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.254 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.255 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.255 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.255 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.255 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.255 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.256 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.256 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.256 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.256 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.256 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.257 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.257 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.257 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.257 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.257 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.258 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.258 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.258 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.258 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.258 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.258 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.259 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.259 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.259 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.259 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.260 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.260 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.260 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.260 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.260 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.261 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.261 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.261 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.261 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
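
oslo_policy.enforce_new_defaults = True and enforce_scope = True mean this deployment runs the newer scoped role-based policy defaults, with operator overrides read from policy.yaml and the policy.d directory. The snippet below is a bare-bones oslo.policy sketch of the Enforcer that consumes those [oslo_policy] options; the rule name, check string and credentials are invented for illustration and are not Nova's actual policies.

# Bare-bones oslo.policy usage (rule name, check string and creds are illustrative).
from oslo_config import cfg
from oslo_policy import policy

CONF = cfg.CONF
CONF([], project='nova')           # initialize the config object (no files parsed here)
enforcer = policy.Enforcer(CONF)   # picks up [oslo_policy] options such as policy_file

# A registered default; a policy.yaml entry with the same rule name would override it.
enforcer.register_default(policy.RuleDefault('compute:servers:show', 'role:admin'))

creds = {'roles': ['admin'], 'project_id': 'p1', 'user_id': 'u1'}
print(enforcer.enforce('compute:servers:show', {'project_id': 'p1'}, creds))
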
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.261 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.262 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.262 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.262 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.262 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.262 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.263 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.263 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.263 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.263 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.263 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.264 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.264 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.264 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.264 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.264 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.264 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.265 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.265 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.265 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.265 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.265 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.266 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.266 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.266 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.266 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.266 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.267 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.267 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.267 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.267 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.267 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.268 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.268 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.268 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.268 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.268 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.269 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.269 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.269 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.269 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.269 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.270 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.270 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.270 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.270 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.270 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.270 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.271 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.271 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.271 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.271 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.272 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.272 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.272 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.272 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.272 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.272 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.273 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.273 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.273 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.273 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.273 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.273 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.274 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.274 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.274 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.274 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.274 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.274 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.275 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.275 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.275 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.275 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.275 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.275 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.275 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.276 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.276 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.276 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.276 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.276 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.276 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.276 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.277 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.277 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.277 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.277 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.277 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.278 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.278 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.278 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.278 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.278 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.278 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.279 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.279 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.279 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.279 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.279 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.279 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.279 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.280 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.280 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.280 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.280 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.280 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.280 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.280 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.281 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.281 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.281 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.281 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.281 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.281 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.282 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.282 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.282 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.282 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.282 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.282 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.282 186893 DEBUG oslo_service.service [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
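Note: the long option dump above (and the one repeated later by the restarted process) is emitted by oslo.config itself; the service just calls ConfigOpts.log_opt_values() at DEBUG level, which is the cfg.py:2609 call site each line points at. A minimal sketch, assuming oslo.config is installed and using two representative options in place of the full nova set:

    # Minimal sketch of how the "group.option = value" dump above is produced.
    # Assumption: oslo.config is installed; the options registered here are
    # stand-ins, not the real nova configuration.
    import logging
    from oslo_config import cfg

    logging.basicConfig(level=logging.DEBUG)
    LOG = logging.getLogger(__name__)

    CONF = cfg.ConfigOpts()
    CONF.register_opts(
        [cfg.BoolOpt('ssl', default=False),
         cfg.IntOpt('rpc_conn_pool_size', default=30)],
        group='oslo_messaging_rabbit')

    CONF([])                                  # parse defaults only, no CLI args
    CONF.log_opt_values(LOG, logging.DEBUG)   # one line per registered option

Options registered as secret (for example transport_url and password) are printed as ****, which matches the masked values in the dump.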
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.284 186893 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 08 20:08:11 compute-0 python3.9[187682]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:08:11 compute-0 systemd[1]: Stopping nova_compute container...
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.496 186893 DEBUG nova.virt.libvirt.host [None req-f12641b1-c31b-4979-98e4-7d4c91af1941 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.498 186893 DEBUG nova.virt.libvirt.host [None req-f12641b1-c31b-4979-98e4-7d4c91af1941 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.498 186893 DEBUG nova.virt.libvirt.host [None req-f12641b1-c31b-4979-98e4-7d4c91af1941 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.499 186893 DEBUG nova.virt.libvirt.host [None req-f12641b1-c31b-4979-98e4-7d4c91af1941 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.508 186893 DEBUG oslo_concurrency.lockutils [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.508 186893 DEBUG oslo_concurrency.lockutils [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:08:11 compute-0 nova_compute[186889]: 2025-12-08 20:08:11.508 186893 DEBUG oslo_concurrency.lockutils [None req-1fe37b05-67c1-45e7-8f86-547df03b8c34 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:08:11 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Dec 08 20:08:11 compute-0 systemd[1]: Started libvirt QEMU daemon.
Dec 08 20:08:12 compute-0 systemd[1]: libpod-549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153.scope: Deactivated successfully.
Dec 08 20:08:12 compute-0 systemd[1]: libpod-549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153.scope: Consumed 3.316s CPU time.
Dec 08 20:08:12 compute-0 podman[187686]: 2025-12-08 20:08:12.093743001 +0000 UTC m=+0.628144763 container died 549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Dec 08 20:08:12 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153-userdata-shm.mount: Deactivated successfully.
Dec 08 20:08:12 compute-0 systemd[1]: var-lib-containers-storage-overlay-d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c-merged.mount: Deactivated successfully.
Dec 08 20:08:12 compute-0 podman[187686]: 2025-12-08 20:08:12.163722946 +0000 UTC m=+0.698124658 container cleanup 549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 08 20:08:12 compute-0 podman[187686]: nova_compute
Dec 08 20:08:12 compute-0 podman[187758]: nova_compute
Dec 08 20:08:12 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 08 20:08:12 compute-0 systemd[1]: Stopped nova_compute container.
Dec 08 20:08:12 compute-0 systemd[1]: Starting nova_compute container...
Dec 08 20:08:12 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:12 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d86052cb4334a85ff046ed9b20e67a74553614af714f56b594498945a7f8340c/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:12 compute-0 podman[187771]: 2025-12-08 20:08:12.386711926 +0000 UTC m=+0.109032389 container init 549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 08 20:08:12 compute-0 podman[187771]: 2025-12-08 20:08:12.402395364 +0000 UTC m=+0.124715797 container start 549ab95d0e2f0663b47f86faa92785f36e93697171582ba77135d4d83dc55153 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3)
Dec 08 20:08:12 compute-0 podman[187771]: nova_compute
Dec 08 20:08:12 compute-0 nova_compute[187787]: + sudo -E kolla_set_configs
Dec 08 20:08:12 compute-0 systemd[1]: Started nova_compute container.
Dec 08 20:08:12 compute-0 sudo[187680]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Validating config file
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Copying service configuration files
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Deleting /etc/ceph
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Creating directory /etc/ceph
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /etc/ceph
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Writing out command to execute
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 08 20:08:12 compute-0 nova_compute[187787]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 08 20:08:12 compute-0 nova_compute[187787]: ++ cat /run_command
Dec 08 20:08:12 compute-0 nova_compute[187787]: + CMD=nova-compute
Dec 08 20:08:12 compute-0 nova_compute[187787]: + ARGS=
Dec 08 20:08:12 compute-0 nova_compute[187787]: + sudo kolla_copy_cacerts
Dec 08 20:08:12 compute-0 nova_compute[187787]: + [[ ! -n '' ]]
Dec 08 20:08:12 compute-0 nova_compute[187787]: + . kolla_extend_start
Dec 08 20:08:12 compute-0 nova_compute[187787]: Running command: 'nova-compute'
Dec 08 20:08:12 compute-0 nova_compute[187787]: + echo 'Running command: '\''nova-compute'\'''
Dec 08 20:08:12 compute-0 nova_compute[187787]: + umask 0022
Dec 08 20:08:12 compute-0 nova_compute[187787]: + exec nova-compute
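Note: the kolla_set_configs / kolla_start lines above follow the usual Kolla container start pattern: read config.json, copy each listed file into place every start (COPY_ALWAYS), fix permissions, then write the service command to /run_command, which the shell trace reads back with `cat /run_command` and execs. A rough sketch of the copy step is below; the config.json schema and permission handling shown are illustrative assumptions, not the real script:

    # Rough sketch of the COPY_ALWAYS step recorded above. Assumptions: a
    # config.json with "command" and "config_files" (source, dest, perm) keys.
    import json
    import os
    import shutil

    CONFIG = "/var/lib/kolla/config_files/config.json"

    def set_configs():
        with open(CONFIG) as f:
            cfg = json.load(f)
        for item in cfg.get("config_files", []):
            src, dest = item["source"], item["dest"]
            if os.path.exists(dest):
                print(f"Deleting {dest}")
                os.remove(dest)
            print(f"Copying {src} to {dest}")
            shutil.copy(src, dest)
            print(f"Setting permission for {dest}")
            os.chmod(dest, int(item.get("perm", "0600"), 8))
        # kolla_start later does: CMD=$(cat /run_command); exec $CMD
        with open("/run_command", "w") as f:
            f.write(cfg["command"])

    if __name__ == "__main__":
        set_configs()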
Dec 08 20:08:12 compute-0 sudo[187952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agrhqyppcqolbtmymvqvfepqdlqfxdzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224492.6507206-1546-184822115856957/AnsiballZ_podman_container.py'
Dec 08 20:08:12 compute-0 sudo[187952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:12 compute-0 sshd-session[187794]: Received disconnect from 172.96.182.111 port 53722:11: Bye Bye [preauth]
Dec 08 20:08:12 compute-0 sshd-session[187794]: Disconnected from authenticating user root 172.96.182.111 port 53722 [preauth]
Dec 08 20:08:13 compute-0 python3.9[187954]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 08 20:08:13 compute-0 systemd[1]: Started libpod-conmon-92846656fc95b24fdc85415530a3efb12232aff121aa2c87562cae5dcb51e43f.scope.
Dec 08 20:08:13 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:08:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3cdb1203989271ffc742c17a4748ed876164121d76c57769643cbce0b791027/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3cdb1203989271ffc742c17a4748ed876164121d76c57769643cbce0b791027/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c3cdb1203989271ffc742c17a4748ed876164121d76c57769643cbce0b791027/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 08 20:08:13 compute-0 podman[187980]: 2025-12-08 20:08:13.507930563 +0000 UTC m=+0.159165037 container init 92846656fc95b24fdc85415530a3efb12232aff121aa2c87562cae5dcb51e43f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 08 20:08:13 compute-0 podman[187980]: 2025-12-08 20:08:13.516783109 +0000 UTC m=+0.168017553 container start 92846656fc95b24fdc85415530a3efb12232aff121aa2c87562cae5dcb51e43f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 08 20:08:13 compute-0 python3.9[187954]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Applying nova statedir ownership
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 08 20:08:13 compute-0 nova_compute_init[188002]: INFO:nova_statedir:Nova statedir ownership complete
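Note: the nova_compute_init lines above are the statedir ownership pass: every path under /var/lib/nova is chowned to the nova uid/gid (42436:42436 per the log) and relabeled with the container SELinux context, while paths listed in NOVA_STATEDIR_OWNERSHIP_SKIP (here /var/lib/nova/compute_id) are left untouched. A simplified sketch of that walk, not the actual /sbin/nova_statedir_ownership.py; the colon-separated SKIP parsing and the chcon call are assumptions:

    # Simplified sketch of the ownership pass logged above.
    import os
    import subprocess

    STATEDIR = "/var/lib/nova"
    TARGET_UID = TARGET_GID = 42436   # "Target ownership for /var/lib/nova: 42436:42436"
    SKIP = set(filter(None, os.environ.get("NOVA_STATEDIR_OWNERSHIP_SKIP", "").split(":")))
    SECONTEXT = "system_u:object_r:container_file_t:s0"

    def fix_statedir():
        for dirpath, dirnames, filenames in os.walk(STATEDIR):
            for name in [None] + filenames:
                path = dirpath if name is None else os.path.join(dirpath, name)
                if path in SKIP:
                    continue
                st = os.stat(path)
                if (st.st_uid, st.st_gid) != (TARGET_UID, TARGET_GID):
                    print(f"Changing ownership of {path} to {TARGET_UID}:{TARGET_GID}")
                    os.chown(path, TARGET_UID, TARGET_GID)
                # relabel so the container keeps access to the bind mount
                subprocess.run(["chcon", SECONTEXT, path], check=False)

    if __name__ == "__main__":
        fix_statedir()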
Dec 08 20:08:13 compute-0 systemd[1]: libpod-92846656fc95b24fdc85415530a3efb12232aff121aa2c87562cae5dcb51e43f.scope: Deactivated successfully.
Dec 08 20:08:13 compute-0 podman[188013]: 2025-12-08 20:08:13.617175289 +0000 UTC m=+0.028087044 container died 92846656fc95b24fdc85415530a3efb12232aff121aa2c87562cae5dcb51e43f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 08 20:08:13 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92846656fc95b24fdc85415530a3efb12232aff121aa2c87562cae5dcb51e43f-userdata-shm.mount: Deactivated successfully.
Dec 08 20:08:13 compute-0 systemd[1]: var-lib-containers-storage-overlay-c3cdb1203989271ffc742c17a4748ed876164121d76c57769643cbce0b791027-merged.mount: Deactivated successfully.
Dec 08 20:08:13 compute-0 podman[188013]: 2025-12-08 20:08:13.686168843 +0000 UTC m=+0.097080518 container cleanup 92846656fc95b24fdc85415530a3efb12232aff121aa2c87562cae5dcb51e43f (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:08:13 compute-0 sudo[187952]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:13 compute-0 systemd[1]: libpod-conmon-92846656fc95b24fdc85415530a3efb12232aff121aa2c87562cae5dcb51e43f.scope: Deactivated successfully.
Dec 08 20:08:14 compute-0 sshd-session[187874]: Invalid user httpd from 103.172.28.62 port 43954
Dec 08 20:08:14 compute-0 sshd-session[159670]: Connection closed by 192.168.122.30 port 37594
Dec 08 20:08:14 compute-0 sshd-session[159667]: pam_unix(sshd:session): session closed for user zuul
Dec 08 20:08:14 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Dec 08 20:08:14 compute-0 systemd[1]: session-23.scope: Consumed 1min 59.575s CPU time.
Dec 08 20:08:14 compute-0 systemd-logind[793]: Session 23 logged out. Waiting for processes to exit.
Dec 08 20:08:14 compute-0 systemd-logind[793]: Removed session 23.
Dec 08 20:08:14 compute-0 sshd-session[187874]: Received disconnect from 103.172.28.62 port 43954:11: Bye Bye [preauth]
Dec 08 20:08:14 compute-0 sshd-session[187874]: Disconnected from invalid user httpd 103.172.28.62 port 43954 [preauth]
Dec 08 20:08:14 compute-0 nova_compute[187787]: 2025-12-08 20:08:14.593 187791 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 08 20:08:14 compute-0 nova_compute[187787]: 2025-12-08 20:08:14.593 187791 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 08 20:08:14 compute-0 nova_compute[187787]: 2025-12-08 20:08:14.593 187791 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Dec 08 20:08:14 compute-0 nova_compute[187787]: 2025-12-08 20:08:14.593 187791 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
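Editor's note on the four "Loaded VIF plugin class" lines above: they are emitted by os_vif's one-time plugin discovery when nova-compute starts. A minimal sketch of triggering the same discovery from a Python process, assuming only that os_vif and its plugin packages are installed; the logging setup here is illustrative, since the real service configures oslo.log from nova.conf:

import logging
import os_vif

# Illustrative; nova-compute wires up oslo.log itself, this just makes the
# DEBUG lines visible when run standalone.
logging.basicConfig(level=logging.DEBUG)

# initialize() discovers VIF plugins (linux_bridge, noop, ovs) through
# stevedore entry points, instantiates each one, and emits the
# "Loaded VIF plugin class ..." DEBUG lines plus the summary INFO line.
os_vif.initialize()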
Dec 08 20:08:14 compute-0 nova_compute[187787]: 2025-12-08 20:08:14.744 187791 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:08:14 compute-0 nova_compute[187787]: 2025-12-08 20:08:14.769 187791 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:08:14 compute-0 nova_compute[187787]: 2025-12-08 20:08:14.770 187791 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
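Editor's note on the failed grep above: it is a capability probe, checking whether the installed iscsiadm binary mentions the node.session.scan option; exit code 1 only means the string was not found. A minimal sketch of the same probe with oslo.concurrency, written as an assumption about how the caller (os-brick) handles the non-zero exit, not as the exact call site:

from oslo_concurrency import processutils

try:
    # grep exits 0 when the string is present; any other exit code makes
    # execute() raise ProcessExecutionError, which produces the
    # "failed. Not Retrying." DEBUG line seen above.
    processutils.execute('grep', '-F', 'node.session.scan', '/sbin/iscsiadm')
    manual_scan_supported = True
except processutils.ProcessExecutionError:
    # Exit code 1: option not mentioned, so treat manual iSCSI scanning
    # as unsupported instead of failing service startup.
    manual_scan_supported = False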
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.224 187791 INFO nova.virt.driver [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.326 187791 INFO nova.compute.provider_config [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.344 187791 DEBUG oslo_concurrency.lockutils [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.345 187791 DEBUG oslo_concurrency.lockutils [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.345 187791 DEBUG oslo_concurrency.lockutils [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
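Editor's note on the singleton_lock acquire/release pair above: those DEBUG lines come from oslo.concurrency's lockutils wrapping the short critical section in which the service registers itself. A minimal sketch of the same pattern; the lock name matches the log, while the body and helper name are illustrative only:

from oslo_concurrency import lockutils

# lockutils.lock() logs the "Acquiring lock", "Acquired lock" and
# "Releasing lock" DEBUG messages around the guarded block.
with lockutils.lock('singleton_lock'):
    # Hypothetical critical section: register the service object exactly once.
    register_service_singleton()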
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.345 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.346 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.346 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.346 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.346 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.346 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.346 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.347 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.347 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.347 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.347 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.347 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.347 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.347 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.348 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.348 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.348 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.348 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.348 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.348 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.349 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.349 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.349 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.349 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.349 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.350 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.350 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.350 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.350 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.350 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.350 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.351 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.351 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.351 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.351 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.351 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.351 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.352 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.352 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.352 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.352 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.352 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.353 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.353 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.353 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.353 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.353 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.353 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.353 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.354 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.354 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.354 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.354 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.354 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.355 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.355 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.355 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.355 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.355 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.355 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.355 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.356 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.356 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.356 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.356 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.356 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.356 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.357 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.357 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.357 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.357 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.357 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.357 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.357 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.358 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.358 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.358 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.358 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.358 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.358 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.359 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.359 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.359 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.359 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.359 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.359 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.360 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.360 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.360 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.360 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.361 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.361 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.361 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.361 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.362 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.362 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.362 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.362 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.362 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.363 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.363 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.363 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.363 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.363 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.364 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.364 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.364 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.364 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.364 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.365 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.365 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.365 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.365 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.365 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.366 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.366 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.366 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.366 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.367 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.367 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.367 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.367 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.367 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.368 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.368 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.368 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.368 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.368 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.368 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.369 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.369 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.369 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.369 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.370 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.370 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.370 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.370 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.370 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.370 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.371 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.371 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.371 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.371 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.371 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.372 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.372 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.372 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.372 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.372 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.373 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.373 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.373 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.373 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.374 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.374 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.374 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.374 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.374 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.375 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.375 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.375 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.375 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.375 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.376 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.376 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.376 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.376 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.377 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.377 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.377 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.377 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.377 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.378 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.378 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.378 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.378 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.378 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.379 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.379 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.379 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.379 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.379 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.380 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.380 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.380 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.380 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.381 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.381 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.381 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.381 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.381 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.381 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.382 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.382 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.382 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.382 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.382 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.383 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.383 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.383 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.383 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.383 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.384 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.384 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.384 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.384 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.385 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.385 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.385 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.385 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.385 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.386 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.386 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.386 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.386 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.386 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.387 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.387 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.387 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.387 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.387 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.387 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.388 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.388 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.388 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.388 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.388 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.389 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.389 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.389 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.389 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.390 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.390 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.390 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.390 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.390 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.391 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.391 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.391 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.391 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.392 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.392 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.392 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.392 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.392 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.393 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.393 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.393 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.393 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.393 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.394 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.394 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.394 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.394 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.394 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.395 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.395 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.395 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.395 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.395 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.396 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.396 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.396 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.397 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.397 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.397 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.397 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.397 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.398 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.398 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.398 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.398 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.398 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.399 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.399 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.399 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.399 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.399 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.400 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.400 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.400 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.400 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.400 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.400 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.401 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.401 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.401 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.401 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.401 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.402 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.402 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.402 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.402 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.402 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.403 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.403 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.403 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.403 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.403 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.404 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.404 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.404 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.404 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.404 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.405 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.405 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.405 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.405 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.405 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.406 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.406 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.406 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.406 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.406 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.407 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.407 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.407 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.407 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.407 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.408 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.408 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.408 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.408 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.408 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.409 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.409 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.409 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.409 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.410 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.410 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.410 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.410 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.410 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.411 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.411 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.411 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.411 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.411 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.411 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.412 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.412 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.412 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.412 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.413 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.413 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.413 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.413 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.413 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.414 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.414 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.414 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.414 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.414 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.415 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.415 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.415 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.415 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.416 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.416 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.416 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.416 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.416 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.417 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.417 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.417 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.417 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.418 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.418 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.418 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.418 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.418 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.418 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.419 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.419 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.419 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.419 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.419 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.420 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.420 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.420 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.420 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.420 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.421 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.421 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.421 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.421 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.421 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.422 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.422 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.422 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.422 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.422 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.423 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.423 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.423 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.423 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.424 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.424 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.424 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.424 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.424 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.425 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.425 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.425 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.425 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.425 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.425 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.426 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.426 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.426 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.426 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.426 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.426 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.427 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.427 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.427 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.427 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.427 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.428 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.428 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.428 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.428 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.428 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.429 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.429 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.429 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.429 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.429 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.430 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.430 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.430 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.430 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.430 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.431 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.431 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.431 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.431 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.431 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.432 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.432 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.432 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.432 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.432 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.432 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.433 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.433 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.433 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.433 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.433 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.434 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.434 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.434 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.434 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.434 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.434 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.435 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.435 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.435 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.435 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.435 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.435 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.435 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.436 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.436 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.436 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.436 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.436 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.436 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.437 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.437 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.437 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.437 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.437 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.437 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.437 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.438 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.438 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.438 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.438 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.438 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.438 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.439 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.439 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.439 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.439 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.439 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.440 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.440 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.440 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.440 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.440 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.441 187791 WARNING oslo_config.cfg [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 08 20:08:15 compute-0 nova_compute[187787]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 08 20:08:15 compute-0 nova_compute[187787]: allow changing the live migration scheme and target URI: ``live_migration_scheme``
Dec 08 20:08:15 compute-0 nova_compute[187787]: and ``live_migration_inbound_addr`` respectively.
Dec 08 20:08:15 compute-0 nova_compute[187787]: ).  Its value may be silently ignored in the future.
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.441 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
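For reference, the deprecation notice above names the two options that replace ``live_migration_uri``. A minimal nova.conf sketch of the equivalent [libvirt] settings, assuming the same qemu+tls transport that the logged value uses; the inbound address below is a hypothetical example and would normally be each compute host's own migration-network address:

[libvirt]
# Replaces the deprecated live_migration_uri = qemu+tls://%s/system
live_migration_scheme = tls
# Hypothetical value; set per host to its migration-network IP or hostname
live_migration_inbound_addr = compute-0.internalapi.example.test

With both options set, Nova builds the target migration URI itself, so the deprecated option can be dropped from the configuration.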
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.441 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.441 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.441 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.441 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.442 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.442 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.442 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.442 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.442 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.443 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.443 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.443 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.443 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.443 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.444 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.444 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.444 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.444 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.444 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.444 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.445 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.445 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.445 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.445 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.445 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.445 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.445 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.446 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.446 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.446 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.446 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.446 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.446 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.447 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.447 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.447 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.447 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.447 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.447 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.447 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.448 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.448 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.448 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.448 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.448 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.448 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.449 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.449 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.449 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.449 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.449 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.449 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.452 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.453 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.453 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.453 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.453 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.454 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.454 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.454 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.454 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.454 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.454 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.455 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.455 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.455 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.455 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.455 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.455 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.455 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.456 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.456 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.456 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.456 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.456 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.456 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.457 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.457 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.457 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.457 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.457 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.458 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.458 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.458 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.458 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.459 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.459 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.459 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.459 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.459 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.460 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.460 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.460 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.460 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.460 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.461 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.461 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.461 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.461 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.462 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.462 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.462 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.462 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.463 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.463 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.463 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.463 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.463 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.463 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.463 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.464 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.464 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.464 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.464 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.464 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.464 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.464 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.465 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.465 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.465 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.465 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.465 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.465 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.465 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.466 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.466 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.466 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.466 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.466 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.466 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.467 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.467 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.467 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.467 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.467 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.467 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.468 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.468 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.468 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.468 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.468 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.468 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.469 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.469 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.469 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.469 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.469 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.469 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.470 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.470 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.470 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.470 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.470 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.471 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.471 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.471 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.471 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.471 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.472 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.472 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.472 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.472 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.472 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.472 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.473 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.473 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.473 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.473 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.473 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.474 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.474 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.474 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.474 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.474 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.474 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.474 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.475 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.475 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.475 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.475 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.475 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.475 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.476 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.476 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.476 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.476 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.476 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.476 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.476 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.477 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.477 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.477 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.477 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.477 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.477 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.478 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.478 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.478 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.478 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.478 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.479 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.479 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.479 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.479 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.479 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.480 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.480 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.480 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.480 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.480 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.480 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.480 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.481 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.481 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.481 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.481 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.481 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.481 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.482 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.482 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.482 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.482 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.482 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.482 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.483 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.483 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.483 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.483 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.483 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.484 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.484 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.484 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.484 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.484 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.484 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.485 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.485 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.485 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.485 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.485 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.486 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.486 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.486 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.486 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.487 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.487 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.487 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.487 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.488 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.488 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.488 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.488 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.489 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.489 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.489 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.489 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.489 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.490 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.490 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.490 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.490 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.490 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.491 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.491 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.491 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.491 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.491 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.492 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.492 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.492 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.492 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.492 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.493 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.493 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.493 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.493 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.493 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.494 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.494 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.494 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.494 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.494 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.495 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.495 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.495 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.495 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.495 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.495 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.495 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.496 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.496 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.496 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.496 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.496 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.497 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.497 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.497 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.497 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.497 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.497 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.498 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.498 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.498 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.498 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.498 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.499 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.499 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.499 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.499 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.499 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.500 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.500 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.500 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.500 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.500 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.501 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.501 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.501 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.501 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.501 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.502 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.502 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.502 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.502 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.502 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.503 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.503 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.503 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.503 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.503 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.504 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.504 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.504 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.504 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.504 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.505 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.505 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.505 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.505 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.506 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.506 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.506 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.506 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.506 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.507 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.507 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.507 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.507 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.507 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.508 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.508 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.508 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.508 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.508 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.508 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.509 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.509 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.509 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.509 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.509 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.510 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.510 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.510 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.510 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.510 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.511 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.511 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.511 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.511 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.511 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.512 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.512 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.512 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.512 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.512 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.512 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.513 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.513 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.513 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.513 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.513 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.514 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.514 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.514 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.514 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.514 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.515 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.515 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.515 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.515 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.515 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.516 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.516 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.516 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.516 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.516 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.517 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.517 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.517 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.517 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.517 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.518 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.518 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.518 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.518 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.518 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.518 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.519 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.519 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.519 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.519 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.519 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.520 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.520 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.520 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.520 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.520 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.521 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.521 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.521 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.521 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.521 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.522 187791 DEBUG oslo_service.service [None req-7c6f6b65-d831-43e6-8dba-4b9ba73058a2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.523 187791 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.548 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.549 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.550 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.550 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.564 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fd9ee1c7b50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.566 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fd9ee1c7b50> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.567 187791 INFO nova.virt.libvirt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Connection event '1' reason 'None'
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.587 187791 WARNING nova.virt.libvirt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 08 20:08:15 compute-0 nova_compute[187787]: 2025-12-08 20:08:15.587 187791 DEBUG nova.virt.libvirt.volume.mount [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.383 187791 INFO nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Libvirt host capabilities <capabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]: 
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <host>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <uuid>58f8fcaa-a5ac-48a3-a561-edc106bffe35</uuid>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <arch>x86_64</arch>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model>EPYC-Rome-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <vendor>AMD</vendor>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <microcode version='16777317'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <signature family='23' model='49' stepping='0'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <maxphysaddr mode='emulate' bits='40'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='x2apic'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='tsc-deadline'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='osxsave'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='hypervisor'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='tsc_adjust'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='spec-ctrl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='stibp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='arch-capabilities'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='cmp_legacy'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='topoext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='virt-ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='lbrv'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='tsc-scale'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='vmcb-clean'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='pause-filter'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='pfthreshold'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='svme-addr-chk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='rdctl-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='skip-l1dfl-vmentry'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='mds-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature name='pschange-mc-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <pages unit='KiB' size='4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <pages unit='KiB' size='2048'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <pages unit='KiB' size='1048576'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <power_management>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <suspend_mem/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <suspend_disk/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <suspend_hybrid/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </power_management>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <iommu support='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <migration_features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <live/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <uri_transports>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <uri_transport>tcp</uri_transport>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <uri_transport>rdma</uri_transport>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </uri_transports>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </migration_features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <topology>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <cells num='1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <cell id='0'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:           <memory unit='KiB'>7864320</memory>
Dec 08 20:08:16 compute-0 nova_compute[187787]:           <pages unit='KiB' size='4'>1966080</pages>
Dec 08 20:08:16 compute-0 nova_compute[187787]:           <pages unit='KiB' size='2048'>0</pages>
Dec 08 20:08:16 compute-0 nova_compute[187787]:           <pages unit='KiB' size='1048576'>0</pages>
Dec 08 20:08:16 compute-0 nova_compute[187787]:           <distances>
Dec 08 20:08:16 compute-0 nova_compute[187787]:             <sibling id='0' value='10'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:           </distances>
Dec 08 20:08:16 compute-0 nova_compute[187787]:           <cpus num='8'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:           </cpus>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         </cell>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </cells>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </topology>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <cache>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </cache>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <secmodel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model>selinux</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <doi>0</doi>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </secmodel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <secmodel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model>dac</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <doi>0</doi>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <baselabel type='kvm'>+107:+107</baselabel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <baselabel type='qemu'>+107:+107</baselabel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </secmodel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </host>
Dec 08 20:08:16 compute-0 nova_compute[187787]: 
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <guest>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <os_type>hvm</os_type>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <arch name='i686'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <wordsize>32</wordsize>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <domain type='qemu'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <domain type='kvm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </arch>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <pae/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <nonpae/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <acpi default='on' toggle='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <apic default='on' toggle='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <cpuselection/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <deviceboot/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <disksnapshot default='on' toggle='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <externalSnapshot/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </guest>
Dec 08 20:08:16 compute-0 nova_compute[187787]: 
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <guest>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <os_type>hvm</os_type>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <arch name='x86_64'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <wordsize>64</wordsize>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <domain type='qemu'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <domain type='kvm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </arch>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <acpi default='on' toggle='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <apic default='on' toggle='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <cpuselection/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <deviceboot/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <disksnapshot default='on' toggle='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <externalSnapshot/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </guest>
Dec 08 20:08:16 compute-0 nova_compute[187787]: 
Dec 08 20:08:16 compute-0 nova_compute[187787]: </capabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]: 
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.389 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.418 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 08 20:08:16 compute-0 nova_compute[187787]: <domainCapabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <path>/usr/libexec/qemu-kvm</path>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <domain>kvm</domain>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <arch>i686</arch>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <vcpu max='240'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <iothreads supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <os supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <enum name='firmware'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <loader supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>rom</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pflash</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='readonly'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>yes</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>no</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='secure'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>no</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </loader>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </os>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='host-passthrough' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='hostPassthroughMigratable'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>on</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>off</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='maximum' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='maximumMigratable'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>on</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>off</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='host-model' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <vendor>AMD</vendor>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='x2apic'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc-deadline'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='hypervisor'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc_adjust'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='spec-ctrl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='stibp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='cmp_legacy'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='overflow-recov'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='succor'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='amd-ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='virt-ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='lbrv'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc-scale'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='vmcb-clean'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='flushbyasid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='pause-filter'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='pfthreshold'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='svme-addr-chk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='disable' name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='custom' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Dhyana-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Genoa'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='auto-ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Genoa-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='auto-ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-128'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-256'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-512'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v6'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v7'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='KnightsMill'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4fmaps'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4vnniw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512er'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512pf'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='KnightsMill-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4fmaps'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4vnniw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512er'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512pf'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G4-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tbm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G5-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tbm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SierraForest'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ne-convert'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cmpccxadd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SierraForest-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ne-convert'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cmpccxadd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='athlon'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='athlon-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='core2duo'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='core2duo-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='coreduo'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='coreduo-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='n270'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='n270-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='phenom'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='phenom-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <memoryBacking supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <enum name='sourceType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>file</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>anonymous</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>memfd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </memoryBacking>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <disk supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='diskDevice'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>disk</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>cdrom</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>floppy</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>lun</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='bus'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ide</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>fdc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>scsi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>sata</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-non-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <graphics supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vnc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>egl-headless</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dbus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </graphics>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <video supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='modelType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vga</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>cirrus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>none</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>bochs</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ramfb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </video>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <hostdev supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='mode'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>subsystem</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='startupPolicy'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>default</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>mandatory</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>requisite</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>optional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='subsysType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pci</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>scsi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='capsType'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='pciBackend'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </hostdev>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <rng supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-non-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>random</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>egd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>builtin</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <filesystem supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='driverType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>path</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>handle</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtiofs</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </filesystem>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <tpm supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tpm-tis</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tpm-crb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>emulator</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>external</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendVersion'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>2.0</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </tpm>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <redirdev supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='bus'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </redirdev>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <channel supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pty</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>unix</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </channel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <crypto supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>qemu</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>builtin</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </crypto>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <interface supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>default</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>passt</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <panic supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>isa</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>hyperv</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </panic>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <console supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>null</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pty</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dev</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>file</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pipe</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>stdio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>udp</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tcp</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>unix</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>qemu-vdagent</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dbus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </console>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <gic supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <vmcoreinfo supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <genid supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <backingStoreInput supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <backup supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <async-teardown supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <ps2 supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <sev supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <sgx supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <hyperv supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='features'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>relaxed</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vapic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>spinlocks</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vpindex</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>runtime</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>synic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>stimer</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>reset</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vendor_id</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>frequencies</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>reenlightenment</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tlbflush</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ipi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>avic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>emsr_bitmap</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>xmm_input</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <defaults>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <spinlocks>4095</spinlocks>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <stimer_direct>on</stimer_direct>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <tlbflush_direct>on</tlbflush_direct>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <tlbflush_extended>on</tlbflush_extended>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </defaults>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </hyperv>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <launchSecurity supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='sectype'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tdx</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </launchSecurity>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </features>
Dec 08 20:08:16 compute-0 nova_compute[187787]: </domainCapabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.426 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 08 20:08:16 compute-0 nova_compute[187787]: <domainCapabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <path>/usr/libexec/qemu-kvm</path>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <domain>kvm</domain>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <arch>i686</arch>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <vcpu max='4096'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <iothreads supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <os supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <enum name='firmware'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <loader supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>rom</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pflash</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='readonly'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>yes</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>no</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='secure'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>no</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </loader>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </os>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='host-passthrough' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='hostPassthroughMigratable'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>on</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>off</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='maximum' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='maximumMigratable'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>on</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>off</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='host-model' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <vendor>AMD</vendor>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='x2apic'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc-deadline'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='hypervisor'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc_adjust'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='spec-ctrl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='stibp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='cmp_legacy'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='overflow-recov'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='succor'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='amd-ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='virt-ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='lbrv'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc-scale'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='vmcb-clean'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='flushbyasid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='pause-filter'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='pfthreshold'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='svme-addr-chk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='disable' name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='custom' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Dhyana-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Genoa'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='auto-ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Genoa-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='auto-ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-128'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-256'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-512'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v6'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v7'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='KnightsMill'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4fmaps'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4vnniw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512er'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512pf'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='KnightsMill-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4fmaps'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4vnniw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512er'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512pf'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G4-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tbm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G5-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tbm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SierraForest'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ne-convert'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cmpccxadd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SierraForest-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ne-convert'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cmpccxadd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='athlon'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='athlon-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='core2duo'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='core2duo-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='coreduo'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='coreduo-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='n270'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='n270-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='phenom'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='phenom-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <memoryBacking supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <enum name='sourceType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>file</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>anonymous</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>memfd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </memoryBacking>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <disk supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='diskDevice'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>disk</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>cdrom</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>floppy</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>lun</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='bus'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>fdc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>scsi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>sata</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-non-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <graphics supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vnc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>egl-headless</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dbus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </graphics>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <video supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='modelType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vga</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>cirrus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>none</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>bochs</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ramfb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </video>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <hostdev supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='mode'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>subsystem</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='startupPolicy'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>default</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>mandatory</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>requisite</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>optional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='subsysType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pci</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>scsi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='capsType'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='pciBackend'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </hostdev>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <rng supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-non-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>random</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>egd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>builtin</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <filesystem supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='driverType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>path</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>handle</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtiofs</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </filesystem>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <tpm supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tpm-tis</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tpm-crb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>emulator</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>external</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendVersion'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>2.0</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </tpm>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <redirdev supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='bus'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </redirdev>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <channel supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pty</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>unix</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </channel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <crypto supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>qemu</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>builtin</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </crypto>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <interface supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>default</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>passt</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <panic supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>isa</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>hyperv</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </panic>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <console supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>null</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pty</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dev</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>file</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pipe</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>stdio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>udp</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tcp</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>unix</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>qemu-vdagent</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dbus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </console>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <gic supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <vmcoreinfo supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <genid supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <backingStoreInput supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <backup supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <async-teardown supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <ps2 supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <sev supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <sgx supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <hyperv supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='features'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>relaxed</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vapic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>spinlocks</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vpindex</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>runtime</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>synic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>stimer</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>reset</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vendor_id</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>frequencies</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>reenlightenment</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tlbflush</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ipi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>avic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>emsr_bitmap</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>xmm_input</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <defaults>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <spinlocks>4095</spinlocks>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <stimer_direct>on</stimer_direct>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <tlbflush_direct>on</tlbflush_direct>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <tlbflush_extended>on</tlbflush_extended>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </defaults>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </hyperv>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <launchSecurity supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='sectype'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tdx</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </launchSecurity>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </features>
Dec 08 20:08:16 compute-0 nova_compute[187787]: </domainCapabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.453 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.457 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 08 20:08:16 compute-0 nova_compute[187787]: <domainCapabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <path>/usr/libexec/qemu-kvm</path>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <domain>kvm</domain>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <machine>pc-i440fx-rhel7.6.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <arch>x86_64</arch>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <vcpu max='240'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <iothreads supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <os supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <enum name='firmware'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <loader supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>rom</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pflash</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='readonly'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>yes</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>no</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='secure'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>no</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </loader>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </os>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='host-passthrough' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='hostPassthroughMigratable'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>on</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>off</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='maximum' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='maximumMigratable'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>on</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>off</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='host-model' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <vendor>AMD</vendor>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='x2apic'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc-deadline'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='hypervisor'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc_adjust'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='spec-ctrl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='stibp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='cmp_legacy'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='overflow-recov'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='succor'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='amd-ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='virt-ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='lbrv'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc-scale'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='vmcb-clean'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='flushbyasid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='pause-filter'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='pfthreshold'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='svme-addr-chk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='disable' name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='custom' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Dhyana-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Genoa'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='auto-ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Genoa-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='auto-ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-128'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-256'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-512'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v6'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v7'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='KnightsMill'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4fmaps'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4vnniw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512er'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512pf'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='KnightsMill-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4fmaps'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4vnniw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512er'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512pf'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G4-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tbm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G5-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tbm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SierraForest'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ne-convert'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cmpccxadd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SierraForest-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ne-convert'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cmpccxadd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='athlon'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='athlon-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='core2duo'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='core2duo-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='coreduo'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='coreduo-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='n270'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='n270-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='phenom'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='phenom-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <memoryBacking supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <enum name='sourceType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>file</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>anonymous</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>memfd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </memoryBacking>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <disk supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='diskDevice'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>disk</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>cdrom</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>floppy</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>lun</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='bus'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ide</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>fdc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>scsi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>sata</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-non-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <graphics supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vnc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>egl-headless</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dbus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </graphics>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <video supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='modelType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vga</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>cirrus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>none</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>bochs</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ramfb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </video>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <hostdev supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='mode'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>subsystem</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='startupPolicy'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>default</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>mandatory</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>requisite</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>optional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='subsysType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pci</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>scsi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='capsType'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='pciBackend'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </hostdev>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <rng supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-non-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>random</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>egd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>builtin</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <filesystem supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='driverType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>path</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>handle</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtiofs</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </filesystem>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <tpm supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tpm-tis</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tpm-crb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>emulator</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>external</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendVersion'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>2.0</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </tpm>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <redirdev supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='bus'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </redirdev>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <channel supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pty</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>unix</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </channel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <crypto supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>qemu</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>builtin</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </crypto>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <interface supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>default</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>passt</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <panic supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>isa</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>hyperv</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </panic>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <console supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>null</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pty</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dev</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>file</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pipe</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>stdio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>udp</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tcp</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>unix</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>qemu-vdagent</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dbus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </console>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <gic supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <vmcoreinfo supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <genid supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <backingStoreInput supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <backup supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <async-teardown supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <ps2 supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <sev supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <sgx supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <hyperv supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='features'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>relaxed</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vapic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>spinlocks</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vpindex</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>runtime</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>synic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>stimer</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>reset</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vendor_id</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>frequencies</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>reenlightenment</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tlbflush</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ipi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>avic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>emsr_bitmap</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>xmm_input</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <defaults>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <spinlocks>4095</spinlocks>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <stimer_direct>on</stimer_direct>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <tlbflush_direct>on</tlbflush_direct>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <tlbflush_extended>on</tlbflush_extended>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </defaults>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </hyperv>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <launchSecurity supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='sectype'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tdx</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </launchSecurity>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </features>
Dec 08 20:08:16 compute-0 nova_compute[187787]: </domainCapabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.517 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 08 20:08:16 compute-0 nova_compute[187787]: <domainCapabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <path>/usr/libexec/qemu-kvm</path>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <domain>kvm</domain>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <machine>pc-q35-rhel9.8.0</machine>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <arch>x86_64</arch>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <vcpu max='4096'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <iothreads supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <os supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <enum name='firmware'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>efi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <loader supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>rom</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pflash</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='readonly'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>yes</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>no</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='secure'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>yes</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>no</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </loader>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </os>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='host-passthrough' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='hostPassthroughMigratable'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>on</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>off</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='maximum' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='maximumMigratable'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>on</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>off</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='host-model' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model fallback='forbid'>EPYC-Rome</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <vendor>AMD</vendor>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <maxphysaddr mode='passthrough' limit='40'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='x2apic'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc-deadline'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='hypervisor'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc_adjust'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='spec-ctrl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='stibp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='cmp_legacy'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='overflow-recov'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='succor'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='amd-ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='virt-ssbd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='lbrv'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='tsc-scale'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='vmcb-clean'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='flushbyasid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='pause-filter'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='pfthreshold'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='svme-addr-chk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='require' name='lfence-always-serializing'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <feature policy='disable' name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <mode name='custom' supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Broadwell-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cascadelake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Cooperlake-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Denverton-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Dhyana-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Genoa'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='auto-ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Genoa-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='auto-ibrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Milan-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amd-psfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='no-nested-data-bp'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='null-sel-clr-base'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='stibp-always-on'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-Rome-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='EPYC-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='GraniteRapids-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-128'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-256'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx10-512'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='prefetchiti'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Haswell-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-noTSX'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v6'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Icelake-Server-v7'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='IvyBridge-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='KnightsMill'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4fmaps'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4vnniw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512er'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512pf'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='KnightsMill-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4fmaps'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-4vnniw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512er'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512pf'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G4-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tbm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Opteron_G5-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fma4'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tbm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xop'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SapphireRapids-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='amx-tile'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-bf16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-fp16'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512-vpopcntdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bitalg'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vbmi2'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrc'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fzrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='la57'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='taa-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='tsx-ldtrk'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xfd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SierraForest'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ne-convert'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cmpccxadd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='SierraForest-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ifma'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-ne-convert'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx-vnni-int8'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='bus-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cmpccxadd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fbsdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='fsrs'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ibrs-all'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mcdt-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pbrsb-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='psdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='sbdr-ssdp-no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='serialize'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vaes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='vpclmulqdq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Client-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='hle'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='rtm'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Skylake-Server-v5'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512bw'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512cd'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512dq'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512f'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='avx512vl'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='invpcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pcid'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='pku'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='mpx'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v2'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v3'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='core-capability'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='split-lock-detect'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='Snowridge-v4'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='cldemote'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='erms'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='gfni'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdir64b'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='movdiri'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='xsaves'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='athlon'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='athlon-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='core2duo'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='core2duo-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='coreduo'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='coreduo-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='n270'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='n270-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='ss'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='phenom'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <blockers model='phenom-v1'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnow'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <feature name='3dnowext'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </blockers>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </mode>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <memoryBacking supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <enum name='sourceType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>file</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>anonymous</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <value>memfd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </memoryBacking>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <disk supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='diskDevice'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>disk</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>cdrom</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>floppy</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>lun</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='bus'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>fdc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>scsi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>sata</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-non-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <graphics supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vnc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>egl-headless</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dbus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </graphics>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <video supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='modelType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vga</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>cirrus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>none</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>bochs</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ramfb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </video>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <hostdev supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='mode'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>subsystem</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='startupPolicy'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>default</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>mandatory</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>requisite</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>optional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='subsysType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pci</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>scsi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='capsType'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='pciBackend'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </hostdev>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <rng supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtio-non-transitional</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>random</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>egd</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>builtin</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <filesystem supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='driverType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>path</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>handle</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>virtiofs</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </filesystem>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <tpm supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tpm-tis</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tpm-crb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>emulator</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>external</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendVersion'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>2.0</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </tpm>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <redirdev supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='bus'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>usb</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </redirdev>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <channel supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pty</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>unix</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </channel>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <crypto supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>qemu</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendModel'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>builtin</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </crypto>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <interface supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='backendType'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>default</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>passt</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <panic supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='model'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>isa</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>hyperv</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </panic>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <console supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='type'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>null</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vc</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pty</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dev</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>file</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>pipe</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>stdio</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>udp</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tcp</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>unix</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>qemu-vdagent</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>dbus</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </console>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   <features>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <gic supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <vmcoreinfo supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <genid supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <backingStoreInput supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <backup supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <async-teardown supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <ps2 supported='yes'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <sev supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <sgx supported='no'/>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <hyperv supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='features'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>relaxed</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vapic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>spinlocks</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vpindex</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>runtime</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>synic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>stimer</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>reset</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>vendor_id</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>frequencies</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>reenlightenment</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tlbflush</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>ipi</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>avic</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>emsr_bitmap</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>xmm_input</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <defaults>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <spinlocks>4095</spinlocks>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <stimer_direct>on</stimer_direct>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <tlbflush_direct>on</tlbflush_direct>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <tlbflush_extended>on</tlbflush_extended>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <vendor_id>Linux KVM Hv</vendor_id>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </defaults>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </hyperv>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     <launchSecurity supported='yes'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       <enum name='sectype'>
Dec 08 20:08:16 compute-0 nova_compute[187787]:         <value>tdx</value>
Dec 08 20:08:16 compute-0 nova_compute[187787]:       </enum>
Dec 08 20:08:16 compute-0 nova_compute[187787]:     </launchSecurity>
Dec 08 20:08:16 compute-0 nova_compute[187787]:   </features>
Dec 08 20:08:16 compute-0 nova_compute[187787]: </domainCapabilities>
Dec 08 20:08:16 compute-0 nova_compute[187787]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.576 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.577 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.577 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.577 187791 INFO nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Secure Boot support detected
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.580 187791 INFO nova.virt.libvirt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.580 187791 INFO nova.virt.libvirt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.594 187791 DEBUG nova.virt.libvirt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.638 187791 INFO nova.virt.node [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Determined node identity b3899b98-89be-4b90-bd85-9c57a93a16c4 from /var/lib/nova/compute_id
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.660 187791 WARNING nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Compute nodes ['b3899b98-89be-4b90-bd85-9c57a93a16c4'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.707 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.781 187791 WARNING nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.782 187791 DEBUG oslo_concurrency.lockutils [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.782 187791 DEBUG oslo_concurrency.lockutils [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.782 187791 DEBUG oslo_concurrency.lockutils [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:08:16 compute-0 nova_compute[187787]: 2025-12-08 20:08:16.782 187791 DEBUG nova.compute.resource_tracker [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:08:16 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Dec 08 20:08:16 compute-0 systemd[1]: Started libvirt nodedev daemon.
Dec 08 20:08:17 compute-0 nova_compute[187787]: 2025-12-08 20:08:17.063 187791 WARNING nova.virt.libvirt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:08:17 compute-0 nova_compute[187787]: 2025-12-08 20:08:17.065 187791 DEBUG nova.compute.resource_tracker [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6122MB free_disk=73.0876693725586GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:08:17 compute-0 nova_compute[187787]: 2025-12-08 20:08:17.065 187791 DEBUG oslo_concurrency.lockutils [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:08:17 compute-0 nova_compute[187787]: 2025-12-08 20:08:17.065 187791 DEBUG oslo_concurrency.lockutils [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:08:17 compute-0 nova_compute[187787]: 2025-12-08 20:08:17.090 187791 WARNING nova.compute.resource_tracker [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] No compute node record for compute-0.ctlplane.example.com:b3899b98-89be-4b90-bd85-9c57a93a16c4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host b3899b98-89be-4b90-bd85-9c57a93a16c4 could not be found.
Dec 08 20:08:17 compute-0 nova_compute[187787]: 2025-12-08 20:08:17.120 187791 INFO nova.compute.resource_tracker [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: b3899b98-89be-4b90-bd85-9c57a93a16c4
Dec 08 20:08:17 compute-0 nova_compute[187787]: 2025-12-08 20:08:17.208 187791 DEBUG nova.compute.resource_tracker [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:08:17 compute-0 nova_compute[187787]: 2025-12-08 20:08:17.209 187791 DEBUG nova.compute.resource_tracker [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.234 187791 INFO nova.scheduler.client.report [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [req-564ed0c5-9964-47af-8bda-645926542b13] Created resource provider record via placement API for resource provider with UUID b3899b98-89be-4b90-bd85-9c57a93a16c4 and name compute-0.ctlplane.example.com.
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.656 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 08 20:08:18 compute-0 nova_compute[187787]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.656 187791 INFO nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] kernel doesn't support AMD SEV
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.657 187791 DEBUG nova.compute.provider_tree [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.658 187791 DEBUG nova.virt.libvirt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.858 187791 DEBUG nova.scheduler.client.report [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Updated inventory for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.858 187791 DEBUG nova.compute.provider_tree [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Updating resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.858 187791 DEBUG nova.compute.provider_tree [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.956 187791 DEBUG nova.compute.provider_tree [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Updating resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.984 187791 DEBUG nova.compute.resource_tracker [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.984 187791 DEBUG oslo_concurrency.lockutils [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.919s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:08:18 compute-0 nova_compute[187787]: 2025-12-08 20:08:18.984 187791 DEBUG nova.service [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Dec 08 20:08:19 compute-0 nova_compute[187787]: 2025-12-08 20:08:19.075 187791 DEBUG nova.service [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Dec 08 20:08:19 compute-0 nova_compute[187787]: 2025-12-08 20:08:19.076 187791 DEBUG nova.servicegroup.drivers.db [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Dec 08 20:08:20 compute-0 sshd-session[188135]: Accepted publickey for zuul from 192.168.122.30 port 40308 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 20:08:20 compute-0 systemd-logind[793]: New session 25 of user zuul.
Dec 08 20:08:20 compute-0 systemd[1]: Started Session 25 of User zuul.
Dec 08 20:08:20 compute-0 sshd-session[188135]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 20:08:21 compute-0 python3.9[188288]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 08 20:08:22 compute-0 sudo[188442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cstnsusenshoiwzrpkebsfyjophsjcsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224502.046585-36-103309175619307/AnsiballZ_systemd_service.py'
Dec 08 20:08:22 compute-0 sudo[188442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:23 compute-0 python3.9[188444]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:08:23 compute-0 systemd[1]: Reloading.
Dec 08 20:08:23 compute-0 systemd-rc-local-generator[188470]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:08:23 compute-0 systemd-sysv-generator[188473]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:08:23 compute-0 sudo[188442]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:24 compute-0 python3.9[188629]: ansible-ansible.builtin.service_facts Invoked
Dec 08 20:08:24 compute-0 network[188646]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 08 20:08:24 compute-0 network[188647]: 'network-scripts' will be removed from distribution in near future.
Dec 08 20:08:24 compute-0 network[188648]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 08 20:08:28 compute-0 sudo[188922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhyploydbuhdeanvbpxdczsjmdtbuewp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224507.7298856-55-19959565387146/AnsiballZ_systemd_service.py'
Dec 08 20:08:28 compute-0 sudo[188922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:28 compute-0 python3.9[188924]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:08:28 compute-0 sudo[188922]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:29 compute-0 sshd-session[188825]: Invalid user user21 from 222.172.32.246 port 2178
Dec 08 20:08:29 compute-0 sudo[189083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcvszwfrfxteodcohdjvupowaptnlsey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224508.6483238-65-15291811646489/AnsiballZ_file.py'
Dec 08 20:08:29 compute-0 sudo[189083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:29 compute-0 podman[189049]: 2025-12-08 20:08:29.193932918 +0000 UTC m=+0.092307551 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:08:29 compute-0 python3.9[189090]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:29 compute-0 sudo[189083]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:29 compute-0 rsyslogd[1005]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 08 20:08:29 compute-0 sudo[189248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-appjujkaxzofiemjsilqxhvkmwfhhjrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224509.5698278-73-88036119895999/AnsiballZ_file.py'
Dec 08 20:08:29 compute-0 sudo[189248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:30 compute-0 python3.9[189250]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:30 compute-0 sudo[189248]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:30 compute-0 sudo[189400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swzkunpaknmlzsnpltxeczycteumcivj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224510.363367-82-228837628279160/AnsiballZ_command.py'
Dec 08 20:08:30 compute-0 sudo[189400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:30 compute-0 python3.9[189402]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:08:31 compute-0 sudo[189400]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:31 compute-0 python3.9[189554]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 08 20:08:32 compute-0 sudo[189704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bthvmtsovdebbzwbxybrdomsfrwbcxie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224512.1724975-100-270185678880681/AnsiballZ_systemd_service.py'
Dec 08 20:08:32 compute-0 sudo[189704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:32 compute-0 python3.9[189706]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:08:32 compute-0 systemd[1]: Reloading.
Dec 08 20:08:32 compute-0 systemd-rc-local-generator[189734]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:08:32 compute-0 systemd-sysv-generator[189737]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:08:33 compute-0 sudo[189704]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:33 compute-0 sudo[189891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbrvqnnfiglhfwudxznvmpchufonapfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224513.335638-108-18892547694932/AnsiballZ_command.py'
Dec 08 20:08:33 compute-0 sudo[189891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:33 compute-0 python3.9[189893]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:08:33 compute-0 sudo[189891]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:34 compute-0 sudo[190044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfbargkrdsskkebhmitflemjgmiqdupx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224514.132811-117-163772548873010/AnsiballZ_file.py'
Dec 08 20:08:34 compute-0 sudo[190044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:34 compute-0 python3.9[190046]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:08:34 compute-0 sudo[190044]: pam_unix(sudo:session): session closed for user root
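The ansible.builtin.file invocation above (owner=zuul, group=zuul, mode=0750, setype=container_file_t, recurse=True, state=directory) roughly corresponds to the following manual steps; a sketch only, Ansible's file module is what actually ran:

    # Create the telemetry config directory and apply the logged ownership,
    # mode and SELinux type (recursive, as in the logged task).
    mkdir -p /var/lib/openstack/config/telemetry
    chown -R zuul:zuul /var/lib/openstack/config/telemetry
    chmod 0750 /var/lib/openstack/config/telemetry
    chcon -R -t container_file_t /var/lib/openstack/config/telemetry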
Dec 08 20:08:35 compute-0 python3.9[190196]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:08:36 compute-0 podman[190322]: 2025-12-08 20:08:36.255843603 +0000 UTC m=+0.170407825 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:08:36 compute-0 python3.9[190358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:36 compute-0 podman[190468]: 2025-12-08 20:08:36.886915602 +0000 UTC m=+0.065888502 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible)
Dec 08 20:08:37 compute-0 python3.9[190507]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224515.765534-133-193826460884470/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:08:37 compute-0 sudo[190665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbfuofbeyuxvgsjqpcambflfciybmony ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224517.2339602-148-128891441956392/AnsiballZ_group.py'
Dec 08 20:08:37 compute-0 sudo[190665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:37 compute-0 python3.9[190667]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 08 20:08:37 compute-0 sudo[190665]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:38 compute-0 sudo[190817]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yymftbpdenhownwczpfwbudsrgjusmnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224518.2553108-159-149109364244272/AnsiballZ_getent.py'
Dec 08 20:08:38 compute-0 sudo[190817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:38 compute-0 python3.9[190819]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 08 20:08:38 compute-0 sudo[190817]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:39 compute-0 sudo[190970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-butmhmckhotbqtjyxxpxrpwjvfomigra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224519.0830207-167-247067745536030/AnsiballZ_group.py'
Dec 08 20:08:39 compute-0 sudo[190970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:39 compute-0 python3.9[190972]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 08 20:08:39 compute-0 groupadd[190973]: group added to /etc/group: name=ceilometer, GID=42405
Dec 08 20:08:39 compute-0 groupadd[190973]: group added to /etc/gshadow: name=ceilometer
Dec 08 20:08:39 compute-0 groupadd[190973]: new group: name=ceilometer, GID=42405
Dec 08 20:08:39 compute-0 sudo[190970]: pam_unix(sudo:session): session closed for user root
Dec 08 20:08:40 compute-0 sudo[191128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tileparmhgvmaplrafkzipqndifqmgai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224519.8098965-175-264246008081994/AnsiballZ_user.py'
Dec 08 20:08:40 compute-0 sudo[191128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:08:40 compute-0 python3.9[191130]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 08 20:08:40 compute-0 useradd[191132]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Dec 08 20:08:40 compute-0 useradd[191132]: add 'ceilometer' to group 'libvirt'
Dec 08 20:08:40 compute-0 useradd[191132]: add 'ceilometer' to shadow group 'libvirt'
Dec 08 20:08:40 compute-0 sudo[191128]: pam_unix(sudo:session): session closed for user root
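The group and user tasks above (GID/UID 42405, supplementary group libvirt, nologin shell) amount to roughly the following; a sketch of equivalent manual commands, not what the Ansible group/user modules executed verbatim:

    # Create the ceilometer group and service account as logged.
    groupadd --gid 42405 ceilometer
    useradd --uid 42405 --gid ceilometer --groups libvirt \
            --comment 'ceilometer user' --shell /sbin/nologin ceilometer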
Dec 08 20:08:41 compute-0 python3.9[191288]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:42 compute-0 python3.9[191409]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765224521.428868-201-103613772430628/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:43 compute-0 python3.9[191559]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:43 compute-0 python3.9[191680]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765224522.5837836-201-178843580958857/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:44 compute-0 python3.9[191830]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:44 compute-0 python3.9[191951]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765224523.8783066-201-1755443476060/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:45 compute-0 python3.9[192101]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:08:46 compute-0 python3.9[192253]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:08:47 compute-0 python3.9[192405]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:47 compute-0 python3.9[192526]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224526.5320191-260-32978137505406/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:48 compute-0 nova_compute[187787]: 2025-12-08 20:08:48.080 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:08:48 compute-0 nova_compute[187787]: 2025-12-08 20:08:48.102 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:08:48 compute-0 python3.9[192676]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:48 compute-0 python3.9[192752]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:49 compute-0 python3.9[192902]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:49 compute-0 python3.9[193023]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224528.8630047-260-80951560427300/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=4096a0f5410f47dcaf8ab19e56a9d8e211effecd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:50 compute-0 python3.9[193173]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:51 compute-0 python3.9[193294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224530.131814-260-21362569521179/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:51 compute-0 python3.9[193444]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:52 compute-0 python3.9[193565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224531.5215614-260-170052424536829/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:53 compute-0 python3.9[193715]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:53 compute-0 python3.9[193836]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224532.6712203-260-204060336405372/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:54 compute-0 python3.9[193986]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:54 compute-0 python3.9[194107]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224533.8602364-260-158708977732208/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:08:54.980 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:08:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:08:54.981 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:08:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:08:54.981 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:08:55 compute-0 python3.9[194257]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:56 compute-0 python3.9[194378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224535.0331435-260-201515756934158/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:56 compute-0 python3.9[194528]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:57 compute-0 python3.9[194649]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224536.21307-260-275952139236349/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:57 compute-0 python3.9[194799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:58 compute-0 python3.9[194920]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224537.4065282-260-240807865120841/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:08:59 compute-0 python3.9[195070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:08:59 compute-0 podman[195165]: 2025-12-08 20:08:59.449313647 +0000 UTC m=+0.059091033 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:08:59 compute-0 python3.9[195209]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224538.62724-260-89831847927393/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:00 compute-0 python3.9[195361]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:09:00 compute-0 python3.9[195437]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:01 compute-0 python3.9[195587]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:09:01 compute-0 python3.9[195663]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:02 compute-0 python3.9[195813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:09:03 compute-0 python3.9[195889]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:03 compute-0 sudo[196039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdbntpgdlxaavjydwupwvuxsqmgnmykt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224543.3266652-449-56537409950819/AnsiballZ_file.py'
Dec 08 20:09:03 compute-0 sudo[196039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:03 compute-0 python3.9[196041]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:03 compute-0 sudo[196039]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:04 compute-0 sudo[196191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhayaprknblpkbsvdruynecgcnnwgivn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224544.027509-457-273826318828520/AnsiballZ_file.py'
Dec 08 20:09:04 compute-0 sudo[196191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:04 compute-0 python3.9[196193]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:04 compute-0 sudo[196191]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:04 compute-0 sudo[196343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-goixueqenvaxqjniqkpkzqvokbokenvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224544.7094712-465-227774850620316/AnsiballZ_file.py'
Dec 08 20:09:04 compute-0 sudo[196343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:05 compute-0 python3.9[196345]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:09:05 compute-0 sudo[196343]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:05 compute-0 sudo[196495]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-edxuseghodvtqysfpiecamvrulfcdsyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224545.4166439-473-246208095883752/AnsiballZ_systemd_service.py'
Dec 08 20:09:05 compute-0 sudo[196495]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:05 compute-0 python3.9[196497]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:09:06 compute-0 systemd[1]: Reloading.
Dec 08 20:09:06 compute-0 systemd-rc-local-generator[196524]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:09:06 compute-0 systemd-sysv-generator[196527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:09:06 compute-0 systemd[1]: Listening on Podman API Socket.
Dec 08 20:09:06 compute-0 sudo[196495]: pam_unix(sudo:session): session closed for user root
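The systemd_service call above (enabled=True, state=started for podman.socket) makes systemd reload its unit files and start listening on the Podman API socket, as the following journal lines confirm. Roughly equivalent to:

    # Enable and start the Podman API socket, as in the logged task.
    systemctl enable --now podman.socket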
Dec 08 20:09:06 compute-0 podman[196535]: 2025-12-08 20:09:06.536618301 +0000 UTC m=+0.115210394 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 08 20:09:07 compute-0 sudo[196723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-baswwylnkpusrjvpnvvyfibjrdzwmutq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224546.7711365-482-182186407974455/AnsiballZ_stat.py'
Dec 08 20:09:07 compute-0 sudo[196723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:07 compute-0 podman[196686]: 2025-12-08 20:09:07.116025194 +0000 UTC m=+0.071122931 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 08 20:09:07 compute-0 python3.9[196732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:09:07 compute-0 sudo[196723]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:07 compute-0 sudo[196853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdiwcwvqpgmokemsqfqmdmmsxlpuuvdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224546.7711365-482-182186407974455/AnsiballZ_copy.py'
Dec 08 20:09:07 compute-0 sudo[196853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:07 compute-0 python3.9[196855]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224546.7711365-482-182186407974455/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:09:07 compute-0 sudo[196853]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:08 compute-0 sudo[196929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtgddnckuxmdygqhitikpcunatskmxwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224546.7711365-482-182186407974455/AnsiballZ_stat.py'
Dec 08 20:09:08 compute-0 sudo[196929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:08 compute-0 python3.9[196931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:09:08 compute-0 sudo[196929]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:08 compute-0 sudo[197052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbrajndpywhznzalskprxuwlfdqphord ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224546.7711365-482-182186407974455/AnsiballZ_copy.py'
Dec 08 20:09:08 compute-0 sudo[197052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:08 compute-0 python3.9[197054]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224546.7711365-482-182186407974455/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:09:08 compute-0 sudo[197052]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:09 compute-0 sudo[197204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uikgmdjcdfuurhiwqfxrrizyyyximssc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224549.1630516-510-11947525900764/AnsiballZ_container_config_data.py'
Dec 08 20:09:09 compute-0 sudo[197204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:09 compute-0 python3.9[197206]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Dec 08 20:09:09 compute-0 sudo[197204]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:10 compute-0 sudo[197356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykofmkfabxmuwvklkgxailyefifddrmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224550.151653-519-229187833422840/AnsiballZ_container_config_hash.py'
Dec 08 20:09:10 compute-0 sudo[197356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:10 compute-0 python3.9[197358]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 08 20:09:10 compute-0 sudo[197356]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:11 compute-0 sudo[197508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvduzzdbltmqmwcsgyefwuvugfbcvpzc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224551.2319613-529-206143551382826/AnsiballZ_edpm_container_manage.py'
Dec 08 20:09:11 compute-0 sudo[197508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:12 compute-0 python3[197510]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 08 20:09:12 compute-0 podman[197546]: 2025-12-08 20:09:12.247894409 +0000 UTC m=+0.049903981 container create 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, io.buildah.version=1.41.4, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 08 20:09:12 compute-0 podman[197546]: 2025-12-08 20:09:12.217633301 +0000 UTC m=+0.019642903 image pull b1b6d71b432c07886b3bae74df4dc9841d1f26407d5f96d6c1e400b0154d9a3d quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested
Dec 08 20:09:12 compute-0 python3[197510]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested kolla_start
Dec 08 20:09:12 compute-0 sudo[197508]: pam_unix(sudo:session): session closed for user root
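The PODMAN-CONTAINER-DEBUG entry above records the full podman create invocation on a single line. The same command, wrapped for readability (arguments exactly as logged; the long --label config_data={...} argument is omitted here for brevity, and quoting of the healthcheck command is added for shell safety):

    podman create --name ceilometer_agent_compute \
        --conmon-pidfile /run/ceilometer_agent_compute.pid \
        --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS \
        --env OS_ENDPOINT_TYPE=internal \
        --healthcheck-command '/openstack/healthcheck compute' \
        --label config_id=edpm \
        --label container_name=ceilometer_agent_compute \
        --label managed_by=edpm_ansible \
        --log-driver journald --log-level info \
        --network host \
        --security-opt label:type:ceilometer_polling_t \
        --user ceilometer \
        --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z \
        --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z \
        --volume /run/libvirt:/run/libvirt:shared,ro \
        --volume /etc/hosts:/etc/hosts:ro \
        --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro \
        --volume /etc/localtime:/etc/localtime:ro \
        --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro \
        --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z \
        --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z \
        --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z \
        --volume /dev/log:/dev/log \
        --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z \
        quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested kolla_start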
Dec 08 20:09:12 compute-0 auditd[706]: Audit daemon rotating log files
Dec 08 20:09:12 compute-0 sudo[197734]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyscgcpfawfvbqqnueneykycrwpohbnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224552.5361903-537-104213825948417/AnsiballZ_stat.py'
Dec 08 20:09:12 compute-0 sudo[197734]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:12 compute-0 python3.9[197736]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:09:13 compute-0 sudo[197734]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:13 compute-0 sudo[197888]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crdievlliqatzitqjsvjmwdhwiaxkvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224553.315016-546-9269674650373/AnsiballZ_file.py'
Dec 08 20:09:13 compute-0 sudo[197888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:13 compute-0 python3.9[197890]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:13 compute-0 sudo[197888]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:14 compute-0 sudo[198039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hapmvcoptpkwkxjbohftnnlfrwbxozpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224553.861542-546-123665371540415/AnsiballZ_copy.py'
Dec 08 20:09:14 compute-0 sudo[198039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:14 compute-0 python3.9[198041]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765224553.861542-546-123665371540415/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:14 compute-0 sudo[198039]: pam_unix(sudo:session): session closed for user root
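The copy task above installs /etc/systemd/system/edpm_ceilometer_agent_compute.service. A typical next step is to reload systemd and start the unit; the unit name is taken from the logged destination, but these verification commands are an assumption and do not appear in the log:

    # Hypothetical follow-up check of the freshly installed unit.
    systemctl daemon-reload
    systemctl start edpm_ceilometer_agent_compute.service
    systemctl is-active edpm_ceilometer_agent_compute.service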
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.782 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.785 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.785 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.785 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.811 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.811 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.812 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.812 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.812 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.813 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.813 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.813 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.813 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.847 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.848 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.848 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:09:14 compute-0 nova_compute[187787]: 2025-12-08 20:09:14.848 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.077 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.079 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6096MB free_disk=73.08702850341797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.079 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.080 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:09:15 compute-0 sudo[198115]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vjqgojufkgnlwznzbvxpjkxwyntrgyjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224553.861542-546-123665371540415/AnsiballZ_systemd.py'
Dec 08 20:09:15 compute-0 sudo[198115]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.185 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.186 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.218 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.243 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.245 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:09:15 compute-0 nova_compute[187787]: 2025-12-08 20:09:15.245 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
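The audit cycle above ends with the inventory pushed to placement for provider b3899b98-89be-4b90-bd85-9c57a93a16c4; schedulable capacity is (total - reserved) * allocation_ratio. A minimal sketch with the values from the inventory line above (plain shell arithmetic; the openstack command assumes the osc-placement client plugin and admin credentials, which are not shown in this log):

    # effective capacity = (total - reserved) * allocation_ratio, per the logged inventory
    echo "VCPU:      $(( (8 - 0) * 4 ))"       # 32 schedulable vCPUs
    echo "MEMORY_MB: $(( (7680 - 512) * 1 ))"  # 7168 MB schedulable RAM
    # DISK_GB has allocation_ratio 0.9, i.e. roughly 79 * 0.9 = 71 GB usable (bash has no float math)
    # cross-check against placement itself:
    # openstack resource provider inventory list b3899b98-89be-4b90-bd85-9c57a93a16c4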
Dec 08 20:09:15 compute-0 python3.9[198117]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:09:15 compute-0 systemd[1]: Reloading.
Dec 08 20:09:15 compute-0 systemd-rc-local-generator[198139]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:09:15 compute-0 systemd-sysv-generator[198148]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:09:15 compute-0 sudo[198115]: pam_unix(sudo:session): session closed for user root
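The systemd-sysv-generator message above is informational: on daemon-reload, systemd generates a compatibility unit for the legacy /etc/rc.d/init.d/network initscript because no native network.service unit file is installed. The generated unit can be inspected directly; a small sketch, assuming root shell access on compute-0:

    # show the unit systemd generated for the SysV initscript
    systemctl cat network.service --no-pager
    # generated SysV compatibility units live under the late generator directory
    ls /run/systemd/generator.late/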
Dec 08 20:09:16 compute-0 sudo[198227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crxgqhcoiweuspmpwhqpdiuljvsqkltz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224553.861542-546-123665371540415/AnsiballZ_systemd.py'
Dec 08 20:09:16 compute-0 sudo[198227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:16 compute-0 python3.9[198229]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:09:16 compute-0 systemd[1]: Reloading.
Dec 08 20:09:16 compute-0 systemd-sysv-generator[198261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:09:16 compute-0 systemd-rc-local-generator[198258]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:09:16 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Dec 08 20:09:16 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc78bd6e095c57782a9dc0c65ae0b3406c43a54c805543911e240811f02fa331/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc78bd6e095c57782a9dc0c65ae0b3406c43a54c805543911e240811f02fa331/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc78bd6e095c57782a9dc0c65ae0b3406c43a54c805543911e240811f02fa331/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc78bd6e095c57782a9dc0c65ae0b3406c43a54c805543911e240811f02fa331/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:16 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964.
Dec 08 20:09:16 compute-0 podman[198268]: 2025-12-08 20:09:16.857488051 +0000 UTC m=+0.111004284 container init 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: + sudo -E kolla_set_configs
Dec 08 20:09:16 compute-0 sudo[198289]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: sudo: unable to send audit message: Operation not permitted
Dec 08 20:09:16 compute-0 sudo[198289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 08 20:09:16 compute-0 podman[198268]: 2025-12-08 20:09:16.888322407 +0000 UTC m=+0.141838630 container start 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=edpm)
Dec 08 20:09:16 compute-0 podman[198268]: ceilometer_agent_compute
Dec 08 20:09:16 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Dec 08 20:09:16 compute-0 sudo[198227]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:16 compute-0 podman[198290]: 2025-12-08 20:09:16.957125256 +0000 UTC m=+0.060507166 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:09:16 compute-0 systemd[1]: 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964-38a7f4c13e95bf38.service: Main process exited, code=exited, status=1/FAILURE
Dec 08 20:09:16 compute-0 systemd[1]: 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964-38a7f4c13e95bf38.service: Failed with result 'exit-code'.
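The transient unit that failed above is the podman healthcheck helper for container 7ac6d8f2fd67..., not the edpm_ceilometer_agent_compute.service wrapper; the first health check apparently returned non-zero while the agent was still starting (health_status=starting, health_failing_streak=1 a few lines earlier). A quick way to re-check once the agent is up, assuming shell access on compute-0:

    # wrapper unit and container state
    sudo systemctl status edpm_ceilometer_agent_compute.service --no-pager
    sudo podman ps --filter name=ceilometer_agent_compute
    # re-run the container's configured health check by hand
    sudo podman healthcheck run ceilometer_agent_compute && echo healthy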
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Validating config file
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Copying service configuration files
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: INFO:__main__:Writing out command to execute
Dec 08 20:09:16 compute-0 sudo[198289]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: ++ cat /run_command
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: + ARGS=
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: + sudo kolla_copy_cacerts
Dec 08 20:09:16 compute-0 sudo[198333]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 08 20:09:16 compute-0 ceilometer_agent_compute[198281]: sudo: unable to send audit message: Operation not permitted
Dec 08 20:09:16 compute-0 sudo[198333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 08 20:09:17 compute-0 sudo[198333]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: + [[ ! -n '' ]]
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: + . kolla_extend_start
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: + umask 0022
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
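The '+' trace lines above are kolla_start's standard bootstrap inside the container: kolla_set_configs copies the files declared in /var/lib/kolla/config_files/config.json into /etc/ceilometer (the INFO:__main__ lines), kolla_copy_cacerts installs the CA bundle, and the script finally execs the command read from /run_command. Both inputs can be inspected from the host; a sketch, assuming the container is still running:

    # the kolla config map that drove the copy operations logged above
    sudo podman exec ceilometer_agent_compute cat /var/lib/kolla/config_files/config.json
    # the command kolla_start wrote out and exec'd
    sudo podman exec ceilometer_agent_compute cat /run_command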
Dec 08 20:09:17 compute-0 sudo[198464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brvuokyukoztyyawzteiudlqianyybak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224557.1091235-570-209322744968187/AnsiballZ_systemd.py'
Dec 08 20:09:17 compute-0 sudo[198464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:17 compute-0 python3.9[198466]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:09:17 compute-0 systemd[1]: Stopping ceilometer_agent_compute container...
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.819 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.12/site-packages/cotyledon/oslo_config_glue.py:45
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.819 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.819 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.819 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.819 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.820 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.820 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.820 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.820 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.820 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.820 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.820 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.820 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.821 2 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.821 2 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.821 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.821 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.821 2 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.821 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.821 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.821 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.822 2 WARNING oslo_config.cfg [-] Deprecated: Option "tenant_name_discovery" from group "DEFAULT" is deprecated. Use option "identity_name_discovery" from group "DEFAULT".
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.822 2 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.822 2 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.822 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.822 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.822 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.822 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.822 2 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.823 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.824 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_certfile        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_enable          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_keyfile         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.825 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] threads_to_process_pollsters   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.826 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.827 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.827 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.827 2 DEBUG cotyledon.oslo_config_glue [-] compute.fetch_extra_metadata   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.827 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.827 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.827 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.827 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.827 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.827 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.828 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.828 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.828 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.12/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.828 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.828 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.828 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.828 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.828 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.828 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.829 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.829 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.829 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.829 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.829 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.829 2 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.829 2 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.829 2 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.829 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.830 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.830 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.830 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_certfile = /etc/ceilometer/tls/tls.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.830 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_enable  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.830 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_keyfile = /etc/ceilometer/tls/tls.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.830 2 DEBUG cotyledon.oslo_config_glue [-] polling.threads_to_process_pollsters = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.830 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.830 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.830 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] service_types.aodh             = alarming log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.831 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.832 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.833 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.834 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.834 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.834 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.834 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.834 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.834 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.834 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.834 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.834 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.835 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.835 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.835 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.835 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.835 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.835 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.835 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.835 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.835 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
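Several exporter options appear twice in the dump above: once at DEFAULT scope (cfg.py:2817), e.g. enable_prometheus_exporter = False and prometheus_listen_addresses = ['127.0.0.1:9101'], and once under the [polling] group (cfg.py:2824), e.g. polling.enable_prometheus_exporter = True, polling.prometheus_listen_addresses = ['[::]:9101'] plus the TLS cert and key paths. The polling agent reads the group-scoped values, so the TLS-enabled exporter on [::]:9101 is what takes effect here; the overrides most likely come from the conf.d snippets copied in by kolla_set_configs earlier. A quick way to confirm where they are set:

    # locate the [polling] overrides in the merged configuration
    sudo podman exec ceilometer_agent_compute grep -r -A8 '^\[polling\]' /etc/ceilometer/ceilometer.conf /etc/ceilometer/ceilometer.conf.d/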
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.859 12 INFO ceilometer.polling.manager [-] Starting heartbeat child service. Listening on /var/lib/ceilometer/ceilometer-compute.socket
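The heartbeat child logged above listens on a unix socket under polling.heartbeat_socket_dir (/var/lib/ceilometer per the config dump). Its presence can be verified from the host; a sketch, assuming the container is still running:

    # the heartbeat socket created by the polling agent's child process
    sudo podman exec ceilometer_agent_compute ls -l /var/lib/ceilometer/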
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.860 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.12/site-packages/cotyledon/oslo_config_glue.py:53
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.860 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.860 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.860 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.860 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.860 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.860 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.861 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.862 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.863 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.864 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_certfile        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_enable          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_keyfile         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.865 12 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.866 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.866 12 DEBUG cotyledon.oslo_config_glue [-] threads_to_process_pollsters   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.866 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.866 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.866 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.866 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.866 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.866 12 DEBUG cotyledon.oslo_config_glue [-] compute.fetch_extra_metadata   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.866 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.867 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.867 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.867 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.867 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.867 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.867 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.867 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.867 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.12/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.867 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.868 12 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.869 12 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.869 12 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.869 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.869 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.869 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.869 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_certfile = /etc/ceilometer/tls/tls.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.869 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_enable  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.869 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_keyfile = /etc/ceilometer/tls/tls.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.869 12 DEBUG cotyledon.oslo_config_glue [-] polling.threads_to_process_pollsters = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.870 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.870 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.870 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.870 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.870 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.870 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.870 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.870 12 DEBUG cotyledon.oslo_config_glue [-] service_types.aodh             = alarming log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.870 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.871 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.871 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.871 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.871 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.871 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.871 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.871 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.871 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.872 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.872 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.872 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.872 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.872 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.872 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.872 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.872 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.873 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.873 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.873 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.873 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.873 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.873 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.873 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.873 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.873 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.874 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.874 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.874 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.874 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.874 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.874 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.874 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.874 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.874 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.875 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.875 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.875 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.875 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.875 12 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.875 12 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.875 12 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.875 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.876 12 DEBUG cotyledon._service [-] Run service AgentHeartBeatManager(0) [12] wait_forever /usr/lib/python3.12/site-packages/cotyledon/_service.py:263
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.878 12 DEBUG ceilometer.polling.manager [-] Started heartbeat child process. run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:519
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.881 12 DEBUG ceilometer.polling.manager [-] Started heartbeat update thread _read_queue /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:522
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.882 12 DEBUG ceilometer.polling.manager [-] Started heartbeat reporting thread _report_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:527
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.969 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.12/site-packages/cotyledon/_service_manager.py:319
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.970 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.12/site-packages/cotyledon/_service_manager.py:323
Dec 08 20:09:17 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:17.970 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentHeartBeatManager(0) [12]
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.084 14 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/utils.py:96
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.093 14 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.093 14 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.093 14 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.203 14 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.12/site-packages/cotyledon/oslo_config_glue.py:53
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.203 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.203 14 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.203 14 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.203 14 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.203 14 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.203 14 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.203 14 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.204 14 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.205 14 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.206 14 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_certfile        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_enable          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_keyfile         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.207 14 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] threads_to_process_pollsters   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.208 14 DEBUG cotyledon.oslo_config_glue [-] compute.fetch_extra_metadata   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.12/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.209 14 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.210 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_certfile = /etc/ceilometer/tls/tls.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_enable  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_keyfile = /etc/ceilometer/tls/tls.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] polling.threads_to_process_pollsters = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] service_types.aodh             = alarming log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.211 14 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.212 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.213 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.214 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.215 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.215 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.215 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.215 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.215 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.215 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.215 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.215 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.215 14 DEBUG cotyledon._service [-] Run service AgentManager(0) [14] wait_forever /usr/lib/python3.12/site-packages/cotyledon/_service.py:263
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.216 14 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [14]
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198281]: 2025-12-08 20:09:18.227 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.12/site-packages/cotyledon/_service_manager.py:335
Dec 08 20:09:18 compute-0 virtqemud[187722]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 08 20:09:18 compute-0 virtqemud[187722]: hostname: compute-0
Dec 08 20:09:18 compute-0 virtqemud[187722]: End of file while reading data: Input/output error
Dec 08 20:09:18 compute-0 systemd[1]: libpod-7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964.scope: Deactivated successfully.
Dec 08 20:09:18 compute-0 systemd[1]: libpod-7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964.scope: Consumed 1.607s CPU time.
Dec 08 20:09:18 compute-0 podman[198470]: 2025-12-08 20:09:18.491657872 +0000 UTC m=+0.740638248 container died 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ceilometer_agent_compute)
Dec 08 20:09:18 compute-0 systemd[1]: 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964-38a7f4c13e95bf38.timer: Deactivated successfully.
Dec 08 20:09:18 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964.
Dec 08 20:09:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964-userdata-shm.mount: Deactivated successfully.
Dec 08 20:09:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-fc78bd6e095c57782a9dc0c65ae0b3406c43a54c805543911e240811f02fa331-merged.mount: Deactivated successfully.
Dec 08 20:09:18 compute-0 podman[198470]: 2025-12-08 20:09:18.5356446 +0000 UTC m=+0.784624976 container cleanup 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 08 20:09:18 compute-0 podman[198470]: ceilometer_agent_compute
Dec 08 20:09:18 compute-0 podman[198509]: ceilometer_agent_compute
Dec 08 20:09:18 compute-0 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Dec 08 20:09:18 compute-0 systemd[1]: Stopped ceilometer_agent_compute container.
Dec 08 20:09:18 compute-0 systemd[1]: Starting ceilometer_agent_compute container...
Dec 08 20:09:18 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc78bd6e095c57782a9dc0c65ae0b3406c43a54c805543911e240811f02fa331/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc78bd6e095c57782a9dc0c65ae0b3406c43a54c805543911e240811f02fa331/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc78bd6e095c57782a9dc0c65ae0b3406c43a54c805543911e240811f02fa331/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fc78bd6e095c57782a9dc0c65ae0b3406c43a54c805543911e240811f02fa331/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:18 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964.
Dec 08 20:09:18 compute-0 podman[198522]: 2025-12-08 20:09:18.710914314 +0000 UTC m=+0.099451740 container init 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: + sudo -E kolla_set_configs
Dec 08 20:09:18 compute-0 sudo[198543]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: sudo: unable to send audit message: Operation not permitted
Dec 08 20:09:18 compute-0 sudo[198543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 08 20:09:18 compute-0 podman[198522]: 2025-12-08 20:09:18.734089544 +0000 UTC m=+0.122626960 container start 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, io.buildah.version=1.41.4)
Dec 08 20:09:18 compute-0 podman[198522]: ceilometer_agent_compute
Dec 08 20:09:18 compute-0 systemd[1]: Started ceilometer_agent_compute container.
Dec 08 20:09:18 compute-0 sudo[198464]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Validating config file
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Copying service configuration files
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: INFO:__main__:Writing out command to execute
Dec 08 20:09:18 compute-0 sudo[198543]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: ++ cat /run_command
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: + ARGS=
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: + sudo kolla_copy_cacerts
Dec 08 20:09:18 compute-0 sudo[198569]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: sudo: unable to send audit message: Operation not permitted
Dec 08 20:09:18 compute-0 sudo[198569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Dec 08 20:09:18 compute-0 sudo[198569]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: + [[ ! -n '' ]]
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: + . kolla_extend_start
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: + umask 0022
Dec 08 20:09:18 compute-0 ceilometer_agent_compute[198537]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 08 20:09:18 compute-0 podman[198544]: 2025-12-08 20:09:18.816819341 +0000 UTC m=+0.070530083 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, container_name=ceilometer_agent_compute, tcib_managed=true)
Dec 08 20:09:18 compute-0 systemd[1]: 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964-7d5e3f42b2984d3c.service: Main process exited, code=exited, status=1/FAILURE
Dec 08 20:09:18 compute-0 systemd[1]: 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964-7d5e3f42b2984d3c.service: Failed with result 'exit-code'.
Dec 08 20:09:19 compute-0 sudo[198718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjyailtguburqcxpqaawepsbknbqzcvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224558.9349887-578-9556155444846/AnsiballZ_stat.py'
Dec 08 20:09:19 compute-0 sudo[198718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:19 compute-0 python3.9[198720]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:09:19 compute-0 sudo[198718]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.608 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.12/site-packages/cotyledon/oslo_config_glue.py:45
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.608 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.608 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.608 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.609 2 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 WARNING oslo_config.cfg [-] Deprecated: Option "tenant_name_discovery" from group "DEFAULT" is deprecated. Use option "identity_name_discovery" from group "DEFAULT".
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.610 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.611 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_certfile        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.612 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_enable          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_keyfile         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.613 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] threads_to_process_pollsters   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] compute.fetch_extra_metadata   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.614 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.12/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.615 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_certfile = /etc/ceilometer/tls/tls.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_enable  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.616 2 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_keyfile = /etc/ceilometer/tls/tls.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] polling.threads_to_process_pollsters = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] service_types.aodh             = alarming log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.617 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.618 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.619 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.620 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.621 2 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.621 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.639 12 INFO ceilometer.polling.manager [-] Starting heartbeat child service. Listening on /var/lib/ceilometer/ceilometer-compute.socket
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.12/site-packages/cotyledon/oslo_config_glue.py:53
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.640 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.641 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.642 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_certfile        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_enable          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_keyfile         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.643 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] threads_to_process_pollsters   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] compute.fetch_extra_metadata   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.12/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.644 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_certfile = /etc/ceilometer/tls/tls.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_enable  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.645 12 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_keyfile = /etc/ceilometer/tls/tls.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] polling.threads_to_process_pollsters = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_types.aodh             = alarming log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.646 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.647 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.648 12 DEBUG cotyledon._service [-] Run service AgentHeartBeatManager(0) [12] wait_forever /usr/lib/python3.12/site-packages/cotyledon/_service.py:263
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.650 12 DEBUG ceilometer.polling.manager [-] Started heartbeat child process. run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:519
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.652 12 DEBUG ceilometer.polling.manager [-] Started heartbeat update thread _read_queue /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:522
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.652 12 DEBUG ceilometer.polling.manager [-] Started heartbeat reporting thread _report_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:527
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.661 14 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/utils.py:96
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.668 14 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.668 14 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.668 14 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.790 14 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.12/site-packages/cotyledon/oslo_config_glue.py:53
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.790 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.790 14 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.790 14 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.790 14 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.790 14 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] enable_notifications           = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] enable_prometheus_exporter     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.791 14 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] heartbeat_socket_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] identity_name_discovery        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] ignore_disabled_projects       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.792 14 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.793 14 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_listen_addresses    = ['127.0.0.1:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.794 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_certfile        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_enable          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] prometheus_tls_keyfile         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.795 14 DEBUG cotyledon.oslo_config_glue [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] threads_to_process_pollsters   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] compute.fetch_extra_metadata   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.796 14 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.12/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.797 14 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_notifications   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.enable_prometheus_exporter = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.heartbeat_socket_dir   = /var/lib/ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.identity_name_discovery = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.ignore_disabled_projects = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_listen_addresses = ['[::]:9101'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_certfile = /etc/ceilometer/tls/tls.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.798 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_enable  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] polling.prometheus_tls_keyfile = /etc/ceilometer/tls/tls.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] polling.threads_to_process_pollsters = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] service_types.aodh             = alarming log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.799 14 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.800 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.801 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.802 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.803 14 DEBUG cotyledon._service [-] Run service AgentManager(0) [14] wait_forever /usr/lib/python3.12/site-packages/cotyledon/_service.py:263
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.808 14 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.12/site-packages/ceilometer/agent.py:64
Dec 08 20:09:19 compute-0 sudo[198850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fwsllqbhtzodqrypysvyjknqritipdrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224558.9349887-578-9556155444846/AnsiballZ_copy.py'
Dec 08 20:09:19 compute-0 sudo[198850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.832 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.832 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.833 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.833 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.833 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.834 14 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/utils.py:96
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.834 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.839 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.839 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.840 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.840 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.840 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.840 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.840 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.840 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.841 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.841 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.841 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.848 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.848 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.848 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.848 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.849 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.849 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.849 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.849 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.849 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.849 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.849 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.849 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:09:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:09:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
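[editor note] The polling cycle above (register pollster, run local_instances discovery, skip each meter because no instances exist yet, then mark it finished) is driven by the agent's polling definition; the source name logged as [pollsters] normally corresponds to an entry in polling.yaml. A minimal sketch of such a file, assuming the default /etc/ceilometer/polling.yaml location and a 120-second interval (both assumptions, neither is recorded in this log):

    # /etc/ceilometer/polling.yaml -- illustrative sketch, not the deployed file
    sources:
      - name: pollsters            # matches "source [pollsters]" in the log lines above
        interval: 120              # assumed value; the interval is not logged here
        meters:
          - cpu
          - memory.usage
          - power.state
          - disk.root.size
          - disk.ephemeral.size
          - disk.device.read.bytes
          - network.incoming.bytes
          # ...plus the remaining disk.device.* and network.* meters named in the cycle above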
Dec 08 20:09:19 compute-0 python3.9[198853]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224558.9349887-578-9556155444846/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:09:20 compute-0 sudo[198850]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:20 compute-0 sudo[199006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imwgfiikosvysngbeyufpsglwsohctjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224560.2880769-595-162518144624887/AnsiballZ_container_config_data.py'
Dec 08 20:09:20 compute-0 sudo[199006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:21 compute-0 python3.9[199008]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Dec 08 20:09:21 compute-0 sudo[199006]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:21 compute-0 sudo[199160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezkuhzcmsmkiwualcpqhwiycywuqddqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224561.3150685-604-131336454687149/AnsiballZ_container_config_hash.py'
Dec 08 20:09:21 compute-0 sudo[199160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:21 compute-0 python3.9[199162]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 08 20:09:21 compute-0 sudo[199160]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:22 compute-0 sudo[199312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxohhbvjzikribxrqwfabsokotwzengl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224562.120755-614-265238268362541/AnsiballZ_edpm_container_manage.py'
Dec 08 20:09:22 compute-0 sudo[199312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:22 compute-0 python3[199314]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 08 20:09:22 compute-0 podman[199350]: 2025-12-08 20:09:22.965283817 +0000 UTC m=+0.116504354 container create 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter)
Dec 08 20:09:22 compute-0 podman[199350]: 2025-12-08 20:09:22.876148013 +0000 UTC m=+0.027368630 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 08 20:09:22 compute-0 python3[199314]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 08 20:09:23 compute-0 sudo[199312]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:23 compute-0 sudo[199539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xefiknpsflrijzgzkmawmoxzzfrgyxac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224563.253528-622-49671233549764/AnsiballZ_stat.py'
Dec 08 20:09:23 compute-0 sudo[199539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:23 compute-0 sshd-session[199056]: Received disconnect from 45.78.217.210 port 55176:11: Bye Bye [preauth]
Dec 08 20:09:23 compute-0 sshd-session[199056]: Disconnected from authenticating user root 45.78.217.210 port 55176 [preauth]
Dec 08 20:09:23 compute-0 python3.9[199541]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:09:23 compute-0 sudo[199539]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:24 compute-0 sudo[199693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgruopfgjjnjiipgpdjqhhcbmouwhtfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224563.980438-631-30244496342516/AnsiballZ_file.py'
Dec 08 20:09:24 compute-0 sudo[199693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:24 compute-0 python3.9[199695]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:24 compute-0 sudo[199693]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:24 compute-0 sudo[199844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aggptvmwcxxsijvjgxpzicrlgxhpycva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224564.5146558-631-239732378396301/AnsiballZ_copy.py'
Dec 08 20:09:24 compute-0 sudo[199844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:25 compute-0 python3.9[199846]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765224564.5146558-631-239732378396301/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:25 compute-0 sudo[199844]: pam_unix(sudo:session): session closed for user root
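[editor note] The ansible-copy task above installs /etc/systemd/system/edpm_node_exporter.service, which the following daemon-reload and restart tasks then pick up. The unit's actual content is not logged; as a rough sketch, units generated for podman-managed containers in this style usually take roughly the following shape (all directives here are assumptions, not copied from the deployment):

    # /etc/systemd/system/edpm_node_exporter.service -- illustrative shape only
    [Unit]
    Description=node_exporter container
    After=network-online.target
    Wants=network-online.target

    [Service]
    Restart=always
    ExecStart=/usr/bin/podman start node_exporter
    ExecStop=/usr/bin/podman stop -t 10 node_exporter

    [Install]
    WantedBy=multi-user.target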
Dec 08 20:09:25 compute-0 sudo[199920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upneldimbzhwhhsadasjlmdmjiwirelb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224564.5146558-631-239732378396301/AnsiballZ_systemd.py'
Dec 08 20:09:25 compute-0 sudo[199920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:25 compute-0 python3.9[199922]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:09:25 compute-0 systemd[1]: Reloading.
Dec 08 20:09:25 compute-0 systemd-sysv-generator[199951]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:09:25 compute-0 systemd-rc-local-generator[199948]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:09:26 compute-0 sudo[199920]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:26 compute-0 sudo[200031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daclwhywdnwyduhzdgkqpmycxpjeabjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224564.5146558-631-239732378396301/AnsiballZ_systemd.py'
Dec 08 20:09:26 compute-0 sudo[200031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:26 compute-0 python3.9[200033]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:09:26 compute-0 systemd[1]: Reloading.
Dec 08 20:09:26 compute-0 systemd-sysv-generator[200064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:09:26 compute-0 systemd-rc-local-generator[200061]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:09:26 compute-0 systemd[1]: Starting node_exporter container...
Dec 08 20:09:27 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:09:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ef95cd9a9a83c195efb0de9d99d08d558c2cfa41917a0218060eef9875914e1/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ef95cd9a9a83c195efb0de9d99d08d558c2cfa41917a0218060eef9875914e1/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:27 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8.
Dec 08 20:09:27 compute-0 podman[200072]: 2025-12-08 20:09:27.127991476 +0000 UTC m=+0.148836632 container init 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.144Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.144Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.144Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.145Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.145Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.145Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=arp
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=bcache
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=bonding
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=cpu
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=edac
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=filefd
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=netclass
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=netdev
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=netstat
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=nfs
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=nvme
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=softnet
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=systemd
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=xfs
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.146Z caller=node_exporter.go:117 level=info collector=zfs
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.147Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 08 20:09:27 compute-0 node_exporter[200085]: ts=2025-12-08T20:09:27.148Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 08 20:09:27 compute-0 podman[200072]: 2025-12-08 20:09:27.155747102 +0000 UTC m=+0.176592248 container start 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:09:27 compute-0 podman[200072]: node_exporter
Dec 08 20:09:27 compute-0 systemd[1]: Started node_exporter container.
Dec 08 20:09:27 compute-0 sudo[200031]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:27 compute-0 podman[200096]: 2025-12-08 20:09:27.223905852 +0000 UTC m=+0.055673719 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:09:27 compute-0 sudo[200271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npepbiurbilvbhfqxnnxkyuwiplgskhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224567.3877063-655-243724060401663/AnsiballZ_systemd.py'
Dec 08 20:09:27 compute-0 sudo[200271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:28 compute-0 python3.9[200273]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:09:28 compute-0 systemd[1]: Stopping node_exporter container...
Dec 08 20:09:28 compute-0 systemd[1]: libpod-0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8.scope: Deactivated successfully.
Dec 08 20:09:28 compute-0 conmon[200085]: conmon 0223112eab84324c03f1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8.scope/container/memory.events
Dec 08 20:09:28 compute-0 podman[200277]: 2025-12-08 20:09:28.251775089 +0000 UTC m=+0.099923940 container died 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:09:28 compute-0 systemd[1]: 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8-627c246244cb9124.timer: Deactivated successfully.
Dec 08 20:09:28 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8.
Dec 08 20:09:28 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8-userdata-shm.mount: Deactivated successfully.
Dec 08 20:09:28 compute-0 systemd[1]: var-lib-containers-storage-overlay-6ef95cd9a9a83c195efb0de9d99d08d558c2cfa41917a0218060eef9875914e1-merged.mount: Deactivated successfully.
Dec 08 20:09:28 compute-0 podman[200277]: 2025-12-08 20:09:28.296568656 +0000 UTC m=+0.144717507 container cleanup 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:09:28 compute-0 podman[200277]: node_exporter
Dec 08 20:09:28 compute-0 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 08 20:09:28 compute-0 podman[200304]: node_exporter
Dec 08 20:09:28 compute-0 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Dec 08 20:09:28 compute-0 systemd[1]: Stopped node_exporter container.
Dec 08 20:09:28 compute-0 systemd[1]: Starting node_exporter container...
Dec 08 20:09:28 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:09:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ef95cd9a9a83c195efb0de9d99d08d558c2cfa41917a0218060eef9875914e1/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:28 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ef95cd9a9a83c195efb0de9d99d08d558c2cfa41917a0218060eef9875914e1/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:28 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8.
Dec 08 20:09:28 compute-0 podman[200317]: 2025-12-08 20:09:28.489610795 +0000 UTC m=+0.103997614 container init 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.502Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.502Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.502Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.502Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.502Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=arp
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=bcache
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=bonding
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=cpu
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=edac
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=filefd
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=netclass
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=netdev
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=netstat
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=nfs
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=nvme
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=softnet
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=systemd
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=xfs
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.503Z caller=node_exporter.go:117 level=info collector=zfs
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.504Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 08 20:09:28 compute-0 node_exporter[200333]: ts=2025-12-08T20:09:28.504Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 08 20:09:28 compute-0 podman[200317]: 2025-12-08 20:09:28.518507835 +0000 UTC m=+0.132894664 container start 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 08 20:09:28 compute-0 podman[200317]: node_exporter
Dec 08 20:09:28 compute-0 systemd[1]: Started node_exporter container.
Dec 08 20:09:28 compute-0 sudo[200271]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:28 compute-0 podman[200342]: 2025-12-08 20:09:28.576592827 +0000 UTC m=+0.049546182 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:09:29 compute-0 sudo[200516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hohutfbmtkkjaqavomznmwuncalksgne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224568.7356584-663-79023863793327/AnsiballZ_stat.py'
Dec 08 20:09:29 compute-0 sudo[200516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:29 compute-0 python3.9[200518]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:09:29 compute-0 sudo[200516]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:29 compute-0 sshd-session[200138]: Invalid user sysadmin from 45.78.228.32 port 40342
Dec 08 20:09:29 compute-0 sudo[200650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymkjeqjzafdxusgdruekoxdzqnqbrwwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224568.7356584-663-79023863793327/AnsiballZ_copy.py'
Dec 08 20:09:29 compute-0 sudo[200650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:29 compute-0 podman[200613]: 2025-12-08 20:09:29.616744799 +0000 UTC m=+0.081961431 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Dec 08 20:09:29 compute-0 sshd-session[200138]: Received disconnect from 45.78.228.32 port 40342:11: Bye Bye [preauth]
Dec 08 20:09:29 compute-0 sshd-session[200138]: Disconnected from invalid user sysadmin 45.78.228.32 port 40342 [preauth]
Dec 08 20:09:29 compute-0 python3.9[200657]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224568.7356584-663-79023863793327/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:09:29 compute-0 sudo[200650]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:30 compute-0 sudo[200807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtsxrvdpxqorkhvsfduynetvjuwhonjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224570.1738367-680-184640394032780/AnsiballZ_container_config_data.py'
Dec 08 20:09:30 compute-0 sudo[200807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:30 compute-0 python3.9[200809]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 08 20:09:30 compute-0 sudo[200807]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:31 compute-0 sudo[200959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyohpmwksnjbddpmcawhdrakvrpojxwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224571.0246782-689-160495370863333/AnsiballZ_container_config_hash.py'
Dec 08 20:09:31 compute-0 sudo[200959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:31 compute-0 python3.9[200961]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 08 20:09:31 compute-0 sudo[200959]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:32 compute-0 sudo[201111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbpugviyrcxcyjwjpfuqznfttumtxvag ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224571.851013-699-163597025770456/AnsiballZ_edpm_container_manage.py'
Dec 08 20:09:32 compute-0 sudo[201111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:32 compute-0 python3[201113]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 08 20:09:34 compute-0 podman[201128]: 2025-12-08 20:09:34.812253016 +0000 UTC m=+2.350463304 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 08 20:09:34 compute-0 podman[201225]: 2025-12-08 20:09:34.951154454 +0000 UTC m=+0.051720999 container create 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Dec 08 20:09:34 compute-0 podman[201225]: 2025-12-08 20:09:34.923678615 +0000 UTC m=+0.024245180 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 08 20:09:34 compute-0 python3[201113]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec 08 20:09:35 compute-0 sudo[201111]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:35 compute-0 sudo[201413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beprtnfpkjkiyimazggedycwlfdlifpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224575.2816725-707-223910435536459/AnsiballZ_stat.py'
Dec 08 20:09:35 compute-0 sudo[201413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:35 compute-0 python3.9[201415]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:09:35 compute-0 sudo[201413]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:36 compute-0 sudo[201567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tclkxgeqmspxeleclajlpgcmbjxqiohb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224576.0422113-716-92172088907339/AnsiballZ_file.py'
Dec 08 20:09:36 compute-0 sudo[201567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:36 compute-0 python3.9[201569]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:36 compute-0 sudo[201567]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:37 compute-0 sudo[201730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eskykjvunxrwfwharywbeisbzahutwxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224576.615156-716-40986495434234/AnsiballZ_copy.py'
Dec 08 20:09:37 compute-0 sudo[201730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:37 compute-0 podman[201692]: 2025-12-08 20:09:37.151834799 +0000 UTC m=+0.093336458 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 08 20:09:37 compute-0 podman[201746]: 2025-12-08 20:09:37.242782214 +0000 UTC m=+0.064399696 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 08 20:09:37 compute-0 python3.9[201739]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765224576.615156-716-40986495434234/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:37 compute-0 sudo[201730]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:37 compute-0 sudo[201839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbijalnjkbwftvizcpfjwyncyvfhkppt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224576.615156-716-40986495434234/AnsiballZ_systemd.py'
Dec 08 20:09:37 compute-0 sudo[201839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:37 compute-0 python3.9[201841]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:09:37 compute-0 systemd[1]: Reloading.
Dec 08 20:09:37 compute-0 systemd-rc-local-generator[201867]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:09:37 compute-0 systemd-sysv-generator[201871]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:09:38 compute-0 sudo[201839]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:38 compute-0 sudo[201950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xyrbaiatuoauukqorjzukubzzuywjqun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224576.615156-716-40986495434234/AnsiballZ_systemd.py'
Dec 08 20:09:38 compute-0 sudo[201950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:38 compute-0 python3.9[201952]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:09:38 compute-0 systemd[1]: Reloading.
Dec 08 20:09:38 compute-0 systemd-rc-local-generator[201979]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:09:38 compute-0 systemd-sysv-generator[201984]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:09:39 compute-0 systemd[1]: Starting podman_exporter container...
Dec 08 20:09:39 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:09:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb0bea40a97a8a8bec58e49ef653c38d59988352dac8d5803d165ad46217cc1c/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb0bea40a97a8a8bec58e49ef653c38d59988352dac8d5803d165ad46217cc1c/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:39 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997.
Dec 08 20:09:39 compute-0 podman[201991]: 2025-12-08 20:09:39.298277241 +0000 UTC m=+0.124304714 container init 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:09:39 compute-0 podman_exporter[202006]: ts=2025-12-08T20:09:39.323Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 08 20:09:39 compute-0 podman_exporter[202006]: ts=2025-12-08T20:09:39.323Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 08 20:09:39 compute-0 podman_exporter[202006]: ts=2025-12-08T20:09:39.324Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 08 20:09:39 compute-0 podman_exporter[202006]: ts=2025-12-08T20:09:39.324Z caller=handler.go:105 level=info collector=container
Dec 08 20:09:39 compute-0 podman[201991]: 2025-12-08 20:09:39.325306835 +0000 UTC m=+0.151334238 container start 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 08 20:09:39 compute-0 podman[201991]: podman_exporter
Dec 08 20:09:39 compute-0 systemd[1]: Starting Podman API Service...
Dec 08 20:09:39 compute-0 systemd[1]: Started Podman API Service.
Dec 08 20:09:39 compute-0 systemd[1]: Started podman_exporter container.
Dec 08 20:09:39 compute-0 podman[202017]: time="2025-12-08T20:09:39Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 08 20:09:39 compute-0 podman[202017]: time="2025-12-08T20:09:39Z" level=info msg="Setting parallel job count to 25"
Dec 08 20:09:39 compute-0 podman[202017]: time="2025-12-08T20:09:39Z" level=info msg="Using sqlite as database backend"
Dec 08 20:09:39 compute-0 podman[202017]: time="2025-12-08T20:09:39Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 08 20:09:39 compute-0 podman[202017]: time="2025-12-08T20:09:39Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 08 20:09:39 compute-0 podman[202017]: time="2025-12-08T20:09:39Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec 08 20:09:39 compute-0 podman[202017]: @ - - [08/Dec/2025:20:09:39 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 08 20:09:39 compute-0 podman[202017]: time="2025-12-08T20:09:39Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:09:39 compute-0 sudo[201950]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:39 compute-0 podman[202015]: 2025-12-08 20:09:39.393282989 +0000 UTC m=+0.056315249 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:09:39 compute-0 podman[202017]: @ - - [08/Dec/2025:20:09:39 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19588 "" "Go-http-client/1.1"
Dec 08 20:09:39 compute-0 podman_exporter[202006]: ts=2025-12-08T20:09:39.394Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 08 20:09:39 compute-0 podman_exporter[202006]: ts=2025-12-08T20:09:39.395Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 08 20:09:39 compute-0 podman_exporter[202006]: ts=2025-12-08T20:09:39.396Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 08 20:09:39 compute-0 systemd[1]: 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997-761fbb227fdd64e3.service: Main process exited, code=exited, status=1/FAILURE
Dec 08 20:09:39 compute-0 systemd[1]: 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997-761fbb227fdd64e3.service: Failed with result 'exit-code'.
Dec 08 20:09:39 compute-0 sudo[202203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxnbncpusshcidhewbtezrynnyipesuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224579.546422-740-53069838256557/AnsiballZ_systemd.py'
Dec 08 20:09:39 compute-0 sudo[202203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:40 compute-0 python3.9[202205]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:09:40 compute-0 systemd[1]: Stopping podman_exporter container...
Dec 08 20:09:40 compute-0 podman[202017]: @ - - [08/Dec/2025:20:09:39 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1449 "" "Go-http-client/1.1"
Dec 08 20:09:40 compute-0 systemd[1]: libpod-5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997.scope: Deactivated successfully.
Dec 08 20:09:40 compute-0 podman[202209]: 2025-12-08 20:09:40.348400747 +0000 UTC m=+0.112724510 container died 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:09:40 compute-0 systemd[1]: 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997-761fbb227fdd64e3.timer: Deactivated successfully.
Dec 08 20:09:40 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997.
Dec 08 20:09:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997-userdata-shm.mount: Deactivated successfully.
Dec 08 20:09:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-fb0bea40a97a8a8bec58e49ef653c38d59988352dac8d5803d165ad46217cc1c-merged.mount: Deactivated successfully.
Dec 08 20:09:40 compute-0 podman[202209]: 2025-12-08 20:09:40.672128381 +0000 UTC m=+0.436452104 container cleanup 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:09:40 compute-0 podman[202209]: podman_exporter
Dec 08 20:09:40 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 08 20:09:40 compute-0 podman[202239]: podman_exporter
Dec 08 20:09:40 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 08 20:09:40 compute-0 systemd[1]: Stopped podman_exporter container.
Dec 08 20:09:40 compute-0 systemd[1]: Starting podman_exporter container...
Dec 08 20:09:40 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:09:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb0bea40a97a8a8bec58e49ef653c38d59988352dac8d5803d165ad46217cc1c/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fb0bea40a97a8a8bec58e49ef653c38d59988352dac8d5803d165ad46217cc1c/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:40 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997.
Dec 08 20:09:40 compute-0 podman[202252]: 2025-12-08 20:09:40.883796369 +0000 UTC m=+0.109265934 container init 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:09:40 compute-0 podman_exporter[202267]: ts=2025-12-08T20:09:40.900Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 08 20:09:40 compute-0 podman_exporter[202267]: ts=2025-12-08T20:09:40.900Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 08 20:09:40 compute-0 podman_exporter[202267]: ts=2025-12-08T20:09:40.900Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 08 20:09:40 compute-0 podman_exporter[202267]: ts=2025-12-08T20:09:40.900Z caller=handler.go:105 level=info collector=container
Dec 08 20:09:40 compute-0 podman[202017]: @ - - [08/Dec/2025:20:09:40 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 08 20:09:40 compute-0 podman[202017]: time="2025-12-08T20:09:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:09:40 compute-0 podman[202252]: 2025-12-08 20:09:40.91394706 +0000 UTC m=+0.139416595 container start 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:09:40 compute-0 podman[202252]: podman_exporter
Dec 08 20:09:40 compute-0 podman[202017]: @ - - [08/Dec/2025:20:09:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19590 "" "Go-http-client/1.1"
Dec 08 20:09:40 compute-0 podman_exporter[202267]: ts=2025-12-08T20:09:40.931Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 08 20:09:40 compute-0 podman_exporter[202267]: ts=2025-12-08T20:09:40.932Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 08 20:09:40 compute-0 podman_exporter[202267]: ts=2025-12-08T20:09:40.932Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 08 20:09:40 compute-0 systemd[1]: Started podman_exporter container.
Dec 08 20:09:40 compute-0 sudo[202203]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:40 compute-0 podman[202276]: 2025-12-08 20:09:40.972606899 +0000 UTC m=+0.049277295 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:09:41 compute-0 sudo[202450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upgwimriykkscejfohjctidvxcdduaba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224581.142087-748-170323142685893/AnsiballZ_stat.py'
Dec 08 20:09:41 compute-0 sudo[202450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:41 compute-0 python3.9[202452]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:09:41 compute-0 sudo[202450]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:42 compute-0 sudo[202573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayupyutaevtuxytpdzvaahwsmoqvtifs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224581.142087-748-170323142685893/AnsiballZ_copy.py'
Dec 08 20:09:42 compute-0 sudo[202573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:42 compute-0 python3.9[202575]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765224581.142087-748-170323142685893/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 08 20:09:42 compute-0 sudo[202573]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:43 compute-0 sudo[202725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgmsypewvynzfyrlrgttusrfmvarsepo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224582.8539202-765-26094262826640/AnsiballZ_container_config_data.py'
Dec 08 20:09:43 compute-0 sudo[202725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:43 compute-0 python3.9[202727]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 08 20:09:43 compute-0 sudo[202725]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:43 compute-0 sudo[202877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eycxaqdxyozoqhxaquqamahibjqzeywr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224583.5719218-774-1239524269490/AnsiballZ_container_config_hash.py'
Dec 08 20:09:43 compute-0 sudo[202877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:44 compute-0 python3.9[202879]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 08 20:09:44 compute-0 sudo[202877]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:44 compute-0 sudo[203029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mksflkdlbxigsxwyalxpekwkgpydlnkb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224584.4420252-784-103172802315980/AnsiballZ_edpm_container_manage.py'
Dec 08 20:09:44 compute-0 sudo[203029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:45 compute-0 python3[203031]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 08 20:09:47 compute-0 podman[203045]: 2025-12-08 20:09:47.216590981 +0000 UTC m=+2.116788808 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 08 20:09:47 compute-0 podman[203141]: 2025-12-08 20:09:47.345664318 +0000 UTC m=+0.041801616 container create adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, vcs-type=git, io.buildah.version=1.33.7)
Dec 08 20:09:47 compute-0 podman[203141]: 2025-12-08 20:09:47.323803352 +0000 UTC m=+0.019940670 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 08 20:09:47 compute-0 python3[203031]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 08 20:09:47 compute-0 sudo[203029]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:47 compute-0 sudo[203329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wymrbdofemffgqisrhgtsggekmpqenxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224587.6116493-792-207573863021150/AnsiballZ_stat.py'
Dec 08 20:09:47 compute-0 sudo[203329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:48 compute-0 python3.9[203331]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:09:48 compute-0 sudo[203329]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:48 compute-0 sudo[203485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxbpnopdkqnnonykkodphzarwxhekfot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224588.340898-801-95339879603213/AnsiballZ_file.py'
Dec 08 20:09:48 compute-0 sudo[203485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:48 compute-0 podman[203487]: 2025-12-08 20:09:48.923751701 +0000 UTC m=+0.060734354 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 08 20:09:48 compute-0 systemd[1]: 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964-7d5e3f42b2984d3c.service: Main process exited, code=exited, status=1/FAILURE
Dec 08 20:09:48 compute-0 systemd[1]: 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964-7d5e3f42b2984d3c.service: Failed with result 'exit-code'.
Dec 08 20:09:49 compute-0 python3.9[203488]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:49 compute-0 sudo[203485]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:49 compute-0 sudo[203656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jobbnknvqcdzzeqojhspnyzfklzuqlmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224589.1340504-801-91768514256058/AnsiballZ_copy.py'
Dec 08 20:09:49 compute-0 sudo[203656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:49 compute-0 python3.9[203658]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765224589.1340504-801-91768514256058/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:09:49 compute-0 sudo[203656]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:50 compute-0 sudo[203732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ylicdjzlxbkgelpvknabcpyvoldbqpso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224589.1340504-801-91768514256058/AnsiballZ_systemd.py'
Dec 08 20:09:50 compute-0 sudo[203732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:50 compute-0 python3.9[203734]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 08 20:09:50 compute-0 systemd[1]: Reloading.
Dec 08 20:09:50 compute-0 systemd-rc-local-generator[203760]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:09:50 compute-0 systemd-sysv-generator[203768]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:09:50 compute-0 sudo[203732]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:50 compute-0 sshd-session[203385]: Invalid user ubuntu from 101.47.160.247 port 57534
Dec 08 20:09:50 compute-0 sudo[203844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-malqwstwckpsjniqkhmprwihstlhaigf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224589.1340504-801-91768514256058/AnsiballZ_systemd.py'
Dec 08 20:09:50 compute-0 sudo[203844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:51 compute-0 sshd-session[203385]: Received disconnect from 101.47.160.247 port 57534:11: Bye Bye [preauth]
Dec 08 20:09:51 compute-0 sshd-session[203385]: Disconnected from invalid user ubuntu 101.47.160.247 port 57534 [preauth]
Dec 08 20:09:51 compute-0 python3.9[203846]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 08 20:09:51 compute-0 systemd[1]: Reloading.
Dec 08 20:09:51 compute-0 systemd-rc-local-generator[203877]: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 08 20:09:51 compute-0 systemd-sysv-generator[203881]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 08 20:09:51 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 08 20:09:51 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:09:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47c2d799371963b4303bcae98eabcfa345ec21c88e21de1a04daf6ed589d2cf1/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47c2d799371963b4303bcae98eabcfa345ec21c88e21de1a04daf6ed589d2cf1/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47c2d799371963b4303bcae98eabcfa345ec21c88e21de1a04daf6ed589d2cf1/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:51 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8.
Dec 08 20:09:51 compute-0 podman[203887]: 2025-12-08 20:09:51.909144235 +0000 UTC m=+0.231223154 container init adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal)
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *bridge.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *coverage.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *datapath.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *iface.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *memory.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *ovnnorthd.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *ovn.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *ovsdbserver.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *pmd_perf.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *pmd_rxq.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: INFO    20:09:51 main.go:48: registering *vswitch.Collector
Dec 08 20:09:51 compute-0 openstack_network_exporter[203903]: NOTICE  20:09:51 main.go:76: listening on https://:9105/metrics
Dec 08 20:09:51 compute-0 podman[203887]: 2025-12-08 20:09:51.944388431 +0000 UTC m=+0.266467330 container start adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Dec 08 20:09:51 compute-0 podman[203887]: openstack_network_exporter
Dec 08 20:09:51 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 08 20:09:51 compute-0 sudo[203844]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:52 compute-0 podman[203913]: 2025-12-08 20:09:52.048746944 +0000 UTC m=+0.081072324 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 08 20:09:52 compute-0 sudo[204086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grrorurifsiockjqojrocmhznyafwcja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224592.1927655-825-198367697361796/AnsiballZ_systemd.py'
Dec 08 20:09:52 compute-0 sudo[204086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:52 compute-0 python3.9[204088]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 08 20:09:52 compute-0 sshd-session[204089]: Received disconnect from 172.190.42.55 port 53750:11: Bye Bye [preauth]
Dec 08 20:09:52 compute-0 sshd-session[204089]: Disconnected from authenticating user root 172.190.42.55 port 53750 [preauth]
Dec 08 20:09:52 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Dec 08 20:09:52 compute-0 systemd[1]: libpod-adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8.scope: Deactivated successfully.
Dec 08 20:09:52 compute-0 podman[204094]: 2025-12-08 20:09:52.897014072 +0000 UTC m=+0.047453249 container died adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, io.openshift.expose-services=, vcs-type=git, version=9.6, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 08 20:09:52 compute-0 systemd[1]: adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8-c1e1abfdb4ec011.timer: Deactivated successfully.
Dec 08 20:09:52 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8.
Dec 08 20:09:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8-userdata-shm.mount: Deactivated successfully.
Dec 08 20:09:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-47c2d799371963b4303bcae98eabcfa345ec21c88e21de1a04daf6ed589d2cf1-merged.mount: Deactivated successfully.
Dec 08 20:09:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:09:54.982 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:09:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:09:54.983 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:09:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:09:54.983 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:09:55 compute-0 podman[204094]: 2025-12-08 20:09:55.144115974 +0000 UTC m=+2.294555161 container cleanup adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, version=9.6, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 08 20:09:55 compute-0 podman[204094]: openstack_network_exporter
Dec 08 20:09:55 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 08 20:09:55 compute-0 podman[204120]: openstack_network_exporter
Dec 08 20:09:55 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 08 20:09:55 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Dec 08 20:09:55 compute-0 systemd[1]: Starting openstack_network_exporter container...
Dec 08 20:09:55 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:09:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47c2d799371963b4303bcae98eabcfa345ec21c88e21de1a04daf6ed589d2cf1/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47c2d799371963b4303bcae98eabcfa345ec21c88e21de1a04daf6ed589d2cf1/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:55 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47c2d799371963b4303bcae98eabcfa345ec21c88e21de1a04daf6ed589d2cf1/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 08 20:09:55 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8.
Dec 08 20:09:55 compute-0 podman[204133]: 2025-12-08 20:09:55.653301697 +0000 UTC m=+0.409193164 container init adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, managed_by=edpm_ansible)
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *bridge.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *coverage.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *datapath.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *iface.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *memory.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *ovnnorthd.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *ovn.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *ovsdbserver.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *pmd_perf.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *pmd_rxq.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: INFO    20:09:55 main.go:48: registering *vswitch.Collector
Dec 08 20:09:55 compute-0 openstack_network_exporter[204149]: NOTICE  20:09:55 main.go:76: listening on https://:9105/metrics
Dec 08 20:09:55 compute-0 podman[204133]: 2025-12-08 20:09:55.683301512 +0000 UTC m=+0.439192959 container start adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, vcs-type=git, distribution-scope=public, io.openshift.expose-services=)
Dec 08 20:09:55 compute-0 podman[204133]: openstack_network_exporter
Dec 08 20:09:55 compute-0 systemd[1]: Started openstack_network_exporter container.
Dec 08 20:09:55 compute-0 sudo[204086]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:55 compute-0 podman[204159]: 2025-12-08 20:09:55.81106794 +0000 UTC m=+0.115766762 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=edpm, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, io.openshift.tags=minimal rhel9)
Dec 08 20:09:56 compute-0 sudo[204328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pnjbxsgqmmucnmrxmexdkuqcgrmhxuws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224595.9880438-833-151229716244232/AnsiballZ_find.py'
Dec 08 20:09:56 compute-0 sudo[204328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:56 compute-0 python3.9[204330]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 08 20:09:56 compute-0 sudo[204328]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:57 compute-0 sudo[204480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkrqkftudxegftzxtcolqvwlsjwajhzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224596.938947-843-160700149037178/AnsiballZ_podman_container_info.py'
Dec 08 20:09:57 compute-0 sudo[204480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:57 compute-0 python3.9[204482]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 08 20:09:57 compute-0 sudo[204480]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:58 compute-0 sudo[204645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uumveehpxcueoxaondgqhildoqvlephw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224598.0089371-851-42458216142512/AnsiballZ_podman_container_exec.py'
Dec 08 20:09:58 compute-0 sudo[204645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:58 compute-0 python3.9[204647]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:09:58 compute-0 systemd[1]: Started libpod-conmon-b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1.scope.
Dec 08 20:09:58 compute-0 podman[204648]: 2025-12-08 20:09:58.757844586 +0000 UTC m=+0.079496546 container exec b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 08 20:09:58 compute-0 podman[204648]: 2025-12-08 20:09:58.792277977 +0000 UTC m=+0.113929937 container exec_died b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 08 20:09:58 compute-0 podman[204665]: 2025-12-08 20:09:58.816802695 +0000 UTC m=+0.060479246 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:09:58 compute-0 systemd[1]: libpod-conmon-b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1.scope: Deactivated successfully.
Dec 08 20:09:58 compute-0 sudo[204645]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:59 compute-0 sudo[204851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvethskvjkiosbmsythompintuattsxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224598.9862182-859-237329742750224/AnsiballZ_podman_container_exec.py'
Dec 08 20:09:59 compute-0 sudo[204851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:09:59 compute-0 python3.9[204853]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:09:59 compute-0 systemd[1]: Started libpod-conmon-b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1.scope.
Dec 08 20:09:59 compute-0 podman[204854]: 2025-12-08 20:09:59.565151534 +0000 UTC m=+0.090580454 container exec b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 08 20:09:59 compute-0 podman[204874]: 2025-12-08 20:09:59.63613014 +0000 UTC m=+0.057324360 container exec_died b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 08 20:09:59 compute-0 podman[204854]: 2025-12-08 20:09:59.652434047 +0000 UTC m=+0.177862977 container exec_died b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:09:59 compute-0 systemd[1]: libpod-conmon-b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1.scope: Deactivated successfully.
Dec 08 20:09:59 compute-0 sudo[204851]: pam_unix(sudo:session): session closed for user root
Dec 08 20:09:59 compute-0 podman[204887]: 2025-12-08 20:09:59.865779776 +0000 UTC m=+0.170487283 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 08 20:10:00 compute-0 sudo[205055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzgehxtvixjhttaocdbbftdrcfiupkir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224600.0174398-867-58020740605264/AnsiballZ_file.py'
Dec 08 20:10:00 compute-0 sudo[205055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:00 compute-0 python3.9[205057]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:00 compute-0 sudo[205055]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:01 compute-0 sudo[205207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfrnxhxkwrvdjlggrmajbycnsoocidou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224600.7891376-876-48877113506818/AnsiballZ_podman_container_info.py'
Dec 08 20:10:01 compute-0 sudo[205207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:01 compute-0 python3.9[205209]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 08 20:10:01 compute-0 sudo[205207]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:01 compute-0 sudo[205372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhiwnfaalwyewqcagwzgtjfdvwpviswq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224601.5429392-884-155927187085966/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:01 compute-0 sudo[205372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:02 compute-0 python3.9[205374]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:02 compute-0 systemd[1]: Started libpod-conmon-2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398.scope.
Dec 08 20:10:02 compute-0 podman[205375]: 2025-12-08 20:10:02.252048633 +0000 UTC m=+0.179951741 container exec 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 08 20:10:02 compute-0 podman[205394]: 2025-12-08 20:10:02.335024944 +0000 UTC m=+0.064006843 container exec_died 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 08 20:10:02 compute-0 podman[205375]: 2025-12-08 20:10:02.378569542 +0000 UTC m=+0.306472640 container exec_died 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:10:02 compute-0 systemd[1]: libpod-conmon-2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398.scope: Deactivated successfully.
Dec 08 20:10:02 compute-0 sudo[205372]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:02 compute-0 sudo[205556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmtvuynjzkhcfecaqfesnfqixuywwhsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224602.626849-892-197335247457811/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:02 compute-0 sudo[205556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:03 compute-0 python3.9[205558]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:03 compute-0 systemd[1]: Started libpod-conmon-2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398.scope.
Dec 08 20:10:03 compute-0 podman[205559]: 2025-12-08 20:10:03.27334862 +0000 UTC m=+0.081026053 container exec 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Dec 08 20:10:03 compute-0 podman[205559]: 2025-12-08 20:10:03.307175102 +0000 UTC m=+0.114852545 container exec_died 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 08 20:10:03 compute-0 systemd[1]: libpod-conmon-2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398.scope: Deactivated successfully.
Dec 08 20:10:03 compute-0 sudo[205556]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:03 compute-0 sudo[205739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-smovnvhjqsrvcrjutidlbuptiimtmdgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224603.5587878-900-206510583540099/AnsiballZ_file.py'
Dec 08 20:10:03 compute-0 sudo[205739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:04 compute-0 python3.9[205741]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:04 compute-0 sudo[205739]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:04 compute-0 sudo[205891]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyykvkmiihojnciqltbephnizalyraev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224604.3957288-909-112570898124541/AnsiballZ_podman_container_info.py'
Dec 08 20:10:04 compute-0 sudo[205891]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:04 compute-0 python3.9[205893]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 08 20:10:05 compute-0 sudo[205891]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:05 compute-0 sudo[206056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xybvnaswgrtzwdllquppyylbujfcsdap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224605.2208192-917-115205111268833/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:05 compute-0 sudo[206056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:05 compute-0 python3.9[206058]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:05 compute-0 systemd[1]: Started libpod-conmon-c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c.scope.
Dec 08 20:10:05 compute-0 podman[206059]: 2025-12-08 20:10:05.786390924 +0000 UTC m=+0.087034656 container exec c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:10:05 compute-0 podman[206059]: 2025-12-08 20:10:05.816930996 +0000 UTC m=+0.117574728 container exec_died c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 08 20:10:05 compute-0 systemd[1]: libpod-conmon-c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c.scope: Deactivated successfully.
Dec 08 20:10:05 compute-0 sudo[206056]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:06 compute-0 sudo[206240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibatvxerxztmarvwguvthspelujtnyxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224606.111995-925-223020103108102/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:06 compute-0 sudo[206240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:06 compute-0 python3.9[206242]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:06 compute-0 systemd[1]: Started libpod-conmon-c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c.scope.
Dec 08 20:10:06 compute-0 podman[206243]: 2025-12-08 20:10:06.818064008 +0000 UTC m=+0.093097522 container exec c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Dec 08 20:10:06 compute-0 podman[206243]: 2025-12-08 20:10:06.849230899 +0000 UTC m=+0.124264373 container exec_died c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd)
Dec 08 20:10:06 compute-0 systemd[1]: libpod-conmon-c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c.scope: Deactivated successfully.
Dec 08 20:10:06 compute-0 sudo[206240]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:07 compute-0 sudo[206446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gocbpmfvdemhjajqcbwvvidgioxbejri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224607.0911996-933-19738781507192/AnsiballZ_file.py'
Dec 08 20:10:07 compute-0 sudo[206446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:07 compute-0 podman[206399]: 2025-12-08 20:10:07.433915235 +0000 UTC m=+0.074411181 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec 08 20:10:07 compute-0 podman[206398]: 2025-12-08 20:10:07.464517979 +0000 UTC m=+0.104789699 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 08 20:10:07 compute-0 python3.9[206460]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:07 compute-0 sudo[206446]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:08 compute-0 sudo[206619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oktttnsafaphvdbazgxrdeybmhngrbbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224607.8957374-942-2408789339336/AnsiballZ_podman_container_info.py'
Dec 08 20:10:08 compute-0 sudo[206619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:08 compute-0 python3.9[206621]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 08 20:10:08 compute-0 sudo[206619]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:09 compute-0 sudo[206784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uupbojokitgbfvyjmaqxznjzfrmdxshs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224608.7692263-950-247073552361998/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:09 compute-0 sudo[206784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:09 compute-0 python3.9[206786]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:09 compute-0 systemd[1]: Started libpod-conmon-7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964.scope.
Dec 08 20:10:09 compute-0 podman[206787]: 2025-12-08 20:10:09.429733801 +0000 UTC m=+0.093187914 container exec 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42)
Dec 08 20:10:09 compute-0 podman[206787]: 2025-12-08 20:10:09.465692408 +0000 UTC m=+0.129146511 container exec_died 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 08 20:10:09 compute-0 systemd[1]: libpod-conmon-7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964.scope: Deactivated successfully.
Dec 08 20:10:09 compute-0 sudo[206784]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:09 compute-0 sudo[206966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxhhjyfxkxdzntxpdoztneyqsupvsjdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224609.6955214-958-203375086680591/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:09 compute-0 sudo[206966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:10 compute-0 python3.9[206968]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:10 compute-0 systemd[1]: Started libpod-conmon-7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964.scope.
Dec 08 20:10:10 compute-0 podman[206969]: 2025-12-08 20:10:10.286712825 +0000 UTC m=+0.096690742 container exec 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0)
Dec 08 20:10:10 compute-0 podman[206969]: 2025-12-08 20:10:10.295129072 +0000 UTC m=+0.105106989 container exec_died 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Dec 08 20:10:10 compute-0 sudo[206966]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:10 compute-0 systemd[1]: libpod-conmon-7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964.scope: Deactivated successfully.
Dec 08 20:10:10 compute-0 sudo[207151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrbiteglokvxxzfjoravecvuahiwrxra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224610.5401866-966-1893760204449/AnsiballZ_file.py'
Dec 08 20:10:10 compute-0 sudo[207151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:10 compute-0 python3.9[207153]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:11 compute-0 sudo[207151]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:11 compute-0 podman[207154]: 2025-12-08 20:10:11.116018294 +0000 UTC m=+0.075778612 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:10:11 compute-0 sudo[207327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlczzbyhbizxatvuyxmzsfuwwdoedukq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224611.2581034-975-73095051241642/AnsiballZ_podman_container_info.py'
Dec 08 20:10:11 compute-0 sudo[207327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:11 compute-0 python3.9[207329]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec 08 20:10:11 compute-0 sudo[207327]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:12 compute-0 sudo[207492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywcygalviardjnniomndspkwkslnclxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224612.0127478-983-278689648851267/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:12 compute-0 sudo[207492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:12 compute-0 python3.9[207494]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:12 compute-0 systemd[1]: Started libpod-conmon-0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8.scope.
Dec 08 20:10:12 compute-0 podman[207495]: 2025-12-08 20:10:12.701401559 +0000 UTC m=+0.085846590 container exec 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 08 20:10:12 compute-0 podman[207495]: 2025-12-08 20:10:12.735424597 +0000 UTC m=+0.119869578 container exec_died 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:10:12 compute-0 sudo[207492]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:12 compute-0 systemd[1]: libpod-conmon-0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8.scope: Deactivated successfully.
Dec 08 20:10:13 compute-0 sudo[207674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzwrzucvhicrtvshcywerlyuvarurwpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224612.9611979-991-207124228229676/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:13 compute-0 sudo[207674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:13 compute-0 python3.9[207676]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:13 compute-0 systemd[1]: Started libpod-conmon-0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8.scope.
Dec 08 20:10:13 compute-0 podman[207677]: 2025-12-08 20:10:13.577398052 +0000 UTC m=+0.076988799 container exec 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:10:13 compute-0 podman[207696]: 2025-12-08 20:10:13.637010831 +0000 UTC m=+0.048954525 container exec_died 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:10:13 compute-0 podman[207677]: 2025-12-08 20:10:13.64285131 +0000 UTC m=+0.142442097 container exec_died 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:10:13 compute-0 systemd[1]: libpod-conmon-0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8.scope: Deactivated successfully.
Dec 08 20:10:13 compute-0 sudo[207674]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:14 compute-0 sudo[207858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqeasncjjnuuiidkvlahveoxhpuencqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224613.8677056-999-53297127858353/AnsiballZ_file.py'
Dec 08 20:10:14 compute-0 sudo[207858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:14 compute-0 python3.9[207860]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:14 compute-0 sudo[207858]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:14 compute-0 sudo[208010]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oordonfpreeqjttxqnryiuilphxkcajy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224614.6612222-1008-39007094876845/AnsiballZ_podman_container_info.py'
Dec 08 20:10:14 compute-0 sudo[208010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:15 compute-0 python3.9[208012]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.235 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.258 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:15 compute-0 sudo[208010]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:15 compute-0 sudo[208175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obhodspkikghwvflxhwmatngdbvsofos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224615.4761238-1016-103135313989597/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:15 compute-0 sudo[208175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.779 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.779 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.830 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.830 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.830 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.830 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.831 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.831 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.860 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.860 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.860 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:10:15 compute-0 nova_compute[187787]: 2025-12-08 20:10:15.861 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:10:15 compute-0 python3.9[208177]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:16 compute-0 systemd[1]: Started libpod-conmon-5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997.scope.
Dec 08 20:10:16 compute-0 podman[208178]: 2025-12-08 20:10:16.076869933 +0000 UTC m=+0.083217959 container exec 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.078 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.079 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5829MB free_disk=72.91721725463867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.079 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.080 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:10:16 compute-0 podman[208178]: 2025-12-08 20:10:16.110231572 +0000 UTC m=+0.116579578 container exec_died 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:10:16 compute-0 systemd[1]: libpod-conmon-5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997.scope: Deactivated successfully.
Dec 08 20:10:16 compute-0 sudo[208175]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.167 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.167 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.191 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.228 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.254 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:10:16 compute-0 nova_compute[187787]: 2025-12-08 20:10:16.254 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:10:16 compute-0 sudo[208358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqzzilizuvsfrtygtqekydfqgklfvgao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224616.3309312-1024-231990813492340/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:16 compute-0 sudo[208358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:16 compute-0 python3.9[208360]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:16 compute-0 systemd[1]: Started libpod-conmon-5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997.scope.
Dec 08 20:10:16 compute-0 podman[208361]: 2025-12-08 20:10:16.916647063 +0000 UTC m=+0.080481877 container exec 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:10:16 compute-0 podman[208361]: 2025-12-08 20:10:16.945694309 +0000 UTC m=+0.109529103 container exec_died 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:10:16 compute-0 systemd[1]: libpod-conmon-5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997.scope: Deactivated successfully.
Dec 08 20:10:16 compute-0 sudo[208358]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:17 compute-0 nova_compute[187787]: 2025-12-08 20:10:17.204 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:17 compute-0 nova_compute[187787]: 2025-12-08 20:10:17.204 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:10:17 compute-0 sudo[208543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuitgjguknobopaotzkfodrlrbdcvtqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224617.1589394-1032-137052849916713/AnsiballZ_file.py'
Dec 08 20:10:17 compute-0 sudo[208543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:17 compute-0 python3.9[208545]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:17 compute-0 sudo[208543]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:18 compute-0 sudo[208695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmvibpcatksbgtnabjzruyoksawltvcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224618.1208704-1041-169221929211782/AnsiballZ_podman_container_info.py'
Dec 08 20:10:18 compute-0 sudo[208695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:18 compute-0 python3.9[208697]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 08 20:10:18 compute-0 sudo[208695]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:19 compute-0 sudo[208870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuhufuirrfkmxbbbwvybqvckolmbqdfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224618.8451319-1049-106324163985852/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:19 compute-0 sudo[208870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:19 compute-0 podman[208834]: 2025-12-08 20:10:19.172731358 +0000 UTC m=+0.070594904 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 08 20:10:19 compute-0 python3.9[208877]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:19 compute-0 systemd[1]: Started libpod-conmon-adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8.scope.
Dec 08 20:10:19 compute-0 podman[208883]: 2025-12-08 20:10:19.441189218 +0000 UTC m=+0.081575960 container exec adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, architecture=x86_64, io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=)
Dec 08 20:10:19 compute-0 podman[208883]: 2025-12-08 20:10:19.476170495 +0000 UTC m=+0.116557227 container exec_died adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, vcs-type=git, config_id=edpm, managed_by=edpm_ansible)
Dec 08 20:10:19 compute-0 systemd[1]: libpod-conmon-adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8.scope: Deactivated successfully.
Dec 08 20:10:19 compute-0 sudo[208870]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:19 compute-0 sudo[209065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssbdcoupsvsgagwsdloiwvesutrjqzhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224619.691999-1057-217878807799758/AnsiballZ_podman_container_exec.py'
Dec 08 20:10:19 compute-0 sudo[209065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:20 compute-0 python3.9[209067]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 08 20:10:20 compute-0 systemd[1]: Started libpod-conmon-adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8.scope.
Dec 08 20:10:20 compute-0 podman[209068]: 2025-12-08 20:10:20.283813524 +0000 UTC m=+0.079970081 container exec adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Dec 08 20:10:20 compute-0 podman[209068]: 2025-12-08 20:10:20.317381308 +0000 UTC m=+0.113537855 container exec_died adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 08 20:10:20 compute-0 systemd[1]: libpod-conmon-adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8.scope: Deactivated successfully.
Dec 08 20:10:20 compute-0 sudo[209065]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:20 compute-0 sudo[209248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-braqbzhetcfqxrkkvxakmpuxoqzqxyfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224620.561882-1065-76592827492607/AnsiballZ_file.py'
Dec 08 20:10:20 compute-0 sudo[209248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:21 compute-0 python3.9[209250]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:21 compute-0 sudo[209248]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:21 compute-0 sudo[209400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgqktoibqgxdqybpoexceotgoaoxuabd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224621.3407032-1074-170145578137123/AnsiballZ_file.py'
Dec 08 20:10:21 compute-0 sudo[209400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:21 compute-0 python3.9[209402]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:21 compute-0 sudo[209400]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:22 compute-0 sudo[209552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlnacagwamxzzbopxyguogijwpgbxoul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224622.1867468-1082-251124886675353/AnsiballZ_stat.py'
Dec 08 20:10:22 compute-0 sudo[209552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:22 compute-0 python3.9[209554]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:10:22 compute-0 sudo[209552]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:23 compute-0 sudo[209675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eorsitxqxqtonlsimbcufppdtdgiqdeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224622.1867468-1082-251124886675353/AnsiballZ_copy.py'
Dec 08 20:10:23 compute-0 sudo[209675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:23 compute-0 python3.9[209677]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765224622.1867468-1082-251124886675353/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:23 compute-0 sudo[209675]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:23 compute-0 sshd-session[207548]: error: kex_exchange_identification: read: Connection timed out
Dec 08 20:10:23 compute-0 sshd-session[207548]: banner exchange: Connection from 125.124.149.14 port 43630: Connection timed out
Dec 08 20:10:23 compute-0 sudo[209827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upggorgntbkigsyrjruqsolktrivvwyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224623.553521-1098-45629972296787/AnsiballZ_file.py'
Dec 08 20:10:23 compute-0 sudo[209827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:24 compute-0 python3.9[209829]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:24 compute-0 sudo[209827]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:24 compute-0 sudo[209979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpagovltvmyabtwduvmwsollanxbuyeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224624.2400475-1106-91690631010170/AnsiballZ_stat.py'
Dec 08 20:10:24 compute-0 sudo[209979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:24 compute-0 python3.9[209981]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:10:24 compute-0 sudo[209979]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:25 compute-0 sudo[210057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdzuisudpxfcweommgufuludpnssqgee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224624.2400475-1106-91690631010170/AnsiballZ_file.py'
Dec 08 20:10:25 compute-0 sudo[210057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:25 compute-0 python3.9[210059]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:25 compute-0 sudo[210057]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:25 compute-0 sudo[210211]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygbcznsoxmtyqwruwsqehvdbqbraqccm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224625.4414756-1118-5487619290775/AnsiballZ_stat.py'
Dec 08 20:10:25 compute-0 sudo[210211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:25 compute-0 python3.9[210213]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:10:25 compute-0 sudo[210211]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:26 compute-0 sudo[210303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwkhpqgkpzongqkrkznjumvffiaojiai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224625.4414756-1118-5487619290775/AnsiballZ_file.py'
Dec 08 20:10:26 compute-0 sudo[210303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:26 compute-0 podman[210265]: 2025-12-08 20:10:26.383051841 +0000 UTC m=+0.073694089 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, vendor=Red Hat, Inc., config_id=edpm, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 08 20:10:26 compute-0 python3.9[210312]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.263heky_ recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:26 compute-0 sudo[210303]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:26 compute-0 sshd-session[210060]: Received disconnect from 47.76.127.165 port 45740:11: Bye Bye [preauth]
Dec 08 20:10:26 compute-0 sshd-session[210060]: Disconnected from authenticating user root 47.76.127.165 port 45740 [preauth]
Dec 08 20:10:26 compute-0 sshd-session[210216]: Received disconnect from 193.46.255.244 port 29682:11:  [preauth]
Dec 08 20:10:26 compute-0 sshd-session[210216]: Disconnected from authenticating user root 193.46.255.244 port 29682 [preauth]
Dec 08 20:10:27 compute-0 sudo[210466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hhgurfgoacmytndofnspvbvypwxldjqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224626.7473917-1130-279490480460891/AnsiballZ_stat.py'
Dec 08 20:10:27 compute-0 sudo[210466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:27 compute-0 python3.9[210468]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:10:27 compute-0 sudo[210466]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:27 compute-0 sudo[210544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phqaczicxqtjocpzfpzsmzydxwrdnjyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224626.7473917-1130-279490480460891/AnsiballZ_file.py'
Dec 08 20:10:27 compute-0 sudo[210544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:27 compute-0 python3.9[210546]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:27 compute-0 sudo[210544]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:28 compute-0 sshd[129409]: Timeout before authentication for connection from 222.172.32.246 to 38.102.83.66, pid = 188825
Dec 08 20:10:28 compute-0 sudo[210696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oemzmkbwkyrbkxugroedfvuzlnkztjbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224627.9014926-1143-272622881854510/AnsiballZ_command.py'
Dec 08 20:10:28 compute-0 sudo[210696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:28 compute-0 python3.9[210698]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:10:28 compute-0 sudo[210696]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:29 compute-0 sudo[210864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jszepiiofxvpghshiqgozcjbexmmkgpu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1765224628.593437-1151-40938806189658/AnsiballZ_edpm_nftables_from_files.py'
Dec 08 20:10:29 compute-0 sudo[210864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:29 compute-0 podman[210825]: 2025-12-08 20:10:29.108086334 +0000 UTC m=+0.055242827 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:10:29 compute-0 python3[210871]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 08 20:10:29 compute-0 sudo[210864]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:29 compute-0 sshd-session[210776]: Invalid user admin from 200.155.38.219 port 19010
Dec 08 20:10:29 compute-0 sshd-session[210776]: Received disconnect from 200.155.38.219 port 19010:11: Bye Bye [preauth]
Dec 08 20:10:29 compute-0 sshd-session[210776]: Disconnected from invalid user admin 200.155.38.219 port 19010 [preauth]
Dec 08 20:10:30 compute-0 sudo[211042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omasflvpvvfggmemqtyonkktysxmgfoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224629.4893937-1159-146036454971835/AnsiballZ_stat.py'
Dec 08 20:10:30 compute-0 sudo[211042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:30 compute-0 podman[211001]: 2025-12-08 20:10:30.104628925 +0000 UTC m=+0.057060343 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:10:30 compute-0 python3.9[211048]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:10:30 compute-0 sudo[211042]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:30 compute-0 sudo[211124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wslukwocucpfquhdjczeijmtqhyqkchn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224629.4893937-1159-146036454971835/AnsiballZ_file.py'
Dec 08 20:10:30 compute-0 sudo[211124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:30 compute-0 python3.9[211126]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:30 compute-0 sudo[211124]: pam_unix(sudo:session): session closed for user root
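Editor's note: each nftables fragment is deployed with the same two-step Ansible pattern visible above: ansible.legacy.stat reads the target's sha1 checksum (get_checksum=True, checksum_algorithm=sha1), then ansible.legacy.file enforces root:root ownership and mode 0600. A rough sketch of what those checks amount to, not the modules' actual implementation:

    import hashlib, os, stat

    def sha1_of(path):
        """sha1 checksum of a file, as the stat task requests (checksum_algorithm=sha1)."""
        h = hashlib.sha1()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def enforce_mode(path, mode=0o600, uid=0, gid=0):
        """Mirror the file task: root:root and mode 0600, changing only what differs."""
        st = os.stat(path)
        if stat.S_IMODE(st.st_mode) != mode:
            os.chmod(path, mode)
        if (st.st_uid, st.st_gid) != (uid, gid):
            os.chown(path, uid, gid)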
Dec 08 20:10:31 compute-0 sudo[211276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoirweyahogcvvuwwmoczcgearlscggm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224630.931476-1171-81676091717340/AnsiballZ_stat.py'
Dec 08 20:10:31 compute-0 sudo[211276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:31 compute-0 python3.9[211278]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:10:31 compute-0 sudo[211276]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:31 compute-0 sudo[211354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kizqlzdpfuqlshredcivssfbkhljbqbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224630.931476-1171-81676091717340/AnsiballZ_file.py'
Dec 08 20:10:31 compute-0 sudo[211354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:31 compute-0 python3.9[211356]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:31 compute-0 sudo[211354]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:32 compute-0 sudo[211506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rseakfnxgjeigldhtlgxhgojpfyulcon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224632.171941-1183-66853576759582/AnsiballZ_stat.py'
Dec 08 20:10:32 compute-0 sudo[211506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:32 compute-0 python3.9[211508]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:10:32 compute-0 sudo[211506]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:32 compute-0 sudo[211584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tdnlhyghxexzkeiqatmwxcvcccuzzqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224632.171941-1183-66853576759582/AnsiballZ_file.py'
Dec 08 20:10:32 compute-0 sudo[211584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:33 compute-0 python3.9[211586]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:33 compute-0 sudo[211584]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:33 compute-0 sudo[211736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-walgvcinehodomvwtmjatyaigbqeoszn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224633.3458793-1195-206793431260389/AnsiballZ_stat.py'
Dec 08 20:10:33 compute-0 sudo[211736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:33 compute-0 python3.9[211738]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:10:34 compute-0 sudo[211736]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:34 compute-0 sudo[211814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvspwruazzqisiqqujluzqgoanwiccvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224633.3458793-1195-206793431260389/AnsiballZ_file.py'
Dec 08 20:10:34 compute-0 sudo[211814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:34 compute-0 python3.9[211816]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:34 compute-0 sudo[211814]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:35 compute-0 sudo[211966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohjivezqdtkqfgjhlwmwxpjgmvzawxpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224634.6233013-1207-141321217529685/AnsiballZ_stat.py'
Dec 08 20:10:35 compute-0 sudo[211966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:35 compute-0 python3.9[211968]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 08 20:10:35 compute-0 sudo[211966]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:35 compute-0 sudo[212091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opybppstnvrfvyskbjdnpyimyuuaymbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224634.6233013-1207-141321217529685/AnsiballZ_copy.py'
Dec 08 20:10:35 compute-0 sudo[212091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:35 compute-0 python3.9[212093]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765224634.6233013-1207-141321217529685/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:35 compute-0 sudo[212091]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:36 compute-0 sudo[212244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkvnyrxsobrmtusczrwlnkivgledbbze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224636.057194-1222-95528804871916/AnsiballZ_file.py'
Dec 08 20:10:36 compute-0 sudo[212244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:36 compute-0 python3.9[212246]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:36 compute-0 sudo[212244]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:37 compute-0 sudo[212396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjxabehnzbpbckxrijhamcqpixzuctby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224636.7508142-1230-93175505761406/AnsiballZ_command.py'
Dec 08 20:10:37 compute-0 sudo[212396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:37 compute-0 python3.9[212398]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:10:37 compute-0 sudo[212396]: pam_unix(sudo:session): session closed for user root
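Editor's note: the command task above concatenates the five fragment files in a fixed order and pipes them into `nft -c -f -`, so the combined ruleset is syntax-checked without being applied. The same check from Python, using exactly the fragment order logged above:

    import subprocess

    FRAGMENTS = [
        "/etc/nftables/edpm-chains.nft",
        "/etc/nftables/edpm-flushes.nft",
        "/etc/nftables/edpm-rules.nft",
        "/etc/nftables/edpm-update-jumps.nft",
        "/etc/nftables/edpm-jumps.nft",
    ]

    def check_ruleset(fragments=FRAGMENTS):
        """Feed the concatenated fragments to 'nft -c -f -' (check only, nothing is applied)."""
        ruleset = b"".join(open(p, "rb").read() for p in fragments)
        return subprocess.run(["nft", "-c", "-f", "-"], input=ruleset, check=False).returncode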
Dec 08 20:10:38 compute-0 sudo[212579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqpjnawteyyubkwnsgmtcqletzstgthe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224637.5522535-1238-212569279632910/AnsiballZ_blockinfile.py'
Dec 08 20:10:38 compute-0 sudo[212579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:38 compute-0 podman[212527]: 2025-12-08 20:10:38.036428029 +0000 UTC m=+0.049775512 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, container_name=multipathd, io.buildah.version=1.41.3)
Dec 08 20:10:38 compute-0 podman[212526]: 2025-12-08 20:10:38.063650851 +0000 UTC m=+0.078804339 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Dec 08 20:10:38 compute-0 python3.9[212594]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:38 compute-0 sudo[212579]: pam_unix(sudo:session): session closed for user root
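Editor's note: the blockinfile task above inserts the four include lines between "# BEGIN ANSIBLE MANAGED BLOCK" and "# END ANSIBLE MANAGED BLOCK" markers in /etc/sysconfig/nftables.conf, validating the result with `nft -c -f %s`. A sketch of the marker-replacement idea only (not the module itself); the block contents are the ones logged above:

    BEGIN = "# BEGIN ANSIBLE MANAGED BLOCK"
    END = "# END ANSIBLE MANAGED BLOCK"
    BLOCK = [
        'include "/etc/nftables/iptables.nft"',
        'include "/etc/nftables/edpm-chains.nft"',
        'include "/etc/nftables/edpm-rules.nft"',
        'include "/etc/nftables/edpm-jumps.nft"',
    ]

    def upsert_block(lines):
        """Replace the managed block when the markers exist, otherwise append it."""
        if BEGIN in lines:
            i, j = lines.index(BEGIN), lines.index(END)
            return lines[:i + 1] + BLOCK + lines[j:]
        return lines + [BEGIN] + BLOCK + [END]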
Dec 08 20:10:38 compute-0 sudo[212750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csincqbmjanqdsrnkeburgglhafevzak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224638.5072446-1247-118679511880587/AnsiballZ_command.py'
Dec 08 20:10:38 compute-0 sudo[212750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:38 compute-0 python3.9[212752]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:10:38 compute-0 sudo[212750]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:39 compute-0 sudo[212903]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axegridkcxvbxrcgvoiryjozjeiwiidt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224639.1603262-1255-195417964499622/AnsiballZ_stat.py'
Dec 08 20:10:39 compute-0 sudo[212903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:39 compute-0 python3.9[212905]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 08 20:10:39 compute-0 sudo[212903]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:40 compute-0 sudo[213057]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eatzridyrqytqzsggqmkhucudnxjuozc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224639.8566792-1263-61628471017542/AnsiballZ_command.py'
Dec 08 20:10:40 compute-0 sudo[213057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:40 compute-0 python3.9[213059]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 08 20:10:40 compute-0 sudo[213057]: pam_unix(sudo:session): session closed for user root
Dec 08 20:10:40 compute-0 sudo[213212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piixrduobiycyrcyvajisbvtsoxswtap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1765224640.5286133-1271-110743834805467/AnsiballZ_file.py'
Dec 08 20:10:40 compute-0 sudo[213212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:10:40 compute-0 python3.9[213214]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 08 20:10:40 compute-0 sudo[213212]: pam_unix(sudo:session): session closed for user root
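Editor's note: /etc/nftables/edpm-rules.nft.changed acts as a change marker: it is touched when the rules file is rewritten (20:10:36), stat'ed later (20:10:39), and once the flushes/rules/update-jumps fragments have been piped into `nft -f -` it is removed (20:10:40). A sketch of that conditional apply, with the same file order as the logged command:

    import os, subprocess

    MARKER = "/etc/nftables/edpm-rules.nft.changed"
    APPLY_ORDER = [
        "/etc/nftables/edpm-flushes.nft",
        "/etc/nftables/edpm-rules.nft",
        "/etc/nftables/edpm-update-jumps.nft",
    ]

    def apply_if_changed():
        """Apply the rule fragments only when the change marker exists, then clear it."""
        if not os.path.exists(MARKER):
            return False
        ruleset = b"".join(open(p, "rb").read() for p in APPLY_ORDER)
        subprocess.run(["nft", "-f", "-"], input=ruleset, check=True)
        os.unlink(MARKER)
        return True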
Dec 08 20:10:41 compute-0 podman[213239]: 2025-12-08 20:10:41.493213588 +0000 UTC m=+0.066740117 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:10:41 compute-0 sshd-session[188138]: Connection closed by 192.168.122.30 port 40308
Dec 08 20:10:41 compute-0 sshd-session[188135]: pam_unix(sshd:session): session closed for user zuul
Dec 08 20:10:41 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Dec 08 20:10:41 compute-0 systemd[1]: session-25.scope: Consumed 1min 43.100s CPU time.
Dec 08 20:10:41 compute-0 systemd-logind[793]: Session 25 logged out. Waiting for processes to exit.
Dec 08 20:10:41 compute-0 systemd-logind[793]: Removed session 25.
Dec 08 20:10:43 compute-0 sshd-session[212094]: Connection closed by 14.103.173.166 port 57750 [preauth]
Dec 08 20:10:49 compute-0 podman[213267]: 2025-12-08 20:10:49.484966857 +0000 UTC m=+0.056752797 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Dec 08 20:10:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:10:54.983 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:10:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:10:54.984 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:10:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:10:54.984 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:10:56 compute-0 podman[213291]: 2025-12-08 20:10:56.541829359 +0000 UTC m=+0.104741303 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter)
Dec 08 20:10:59 compute-0 podman[213313]: 2025-12-08 20:10:59.479637907 +0000 UTC m=+0.051626638 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:10:59 compute-0 podman[202017]: time="2025-12-08T20:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:10:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:10:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3404 "" "Go-http-client/1.1"
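Editor's note: these GET requests come from podman_exporter talking to the podman service socket (its config above sets CONTAINER_HOST=unix:///run/podman/podman.sock). A sketch of issuing the same libpod request from Python over the unix socket; the URL path is the one logged above, the connection class is just a minimal helper:

    import http.client, json, socket

    class UnixHTTPConnection(http.client.HTTPConnection):
        """HTTP over a unix socket, enough to talk to the libpod API."""
        def __init__(self, path):
            super().__init__("localhost")
            self._path = path
        def connect(self):
            s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            s.connect(self._path)
            self.sock = s

    conn = UnixHTTPConnection("/run/podman/podman.sock")
    conn.request("GET", "/v4.9.3/libpod/containers/json?all=true")
    containers = json.loads(conn.getresponse().read())
    print(len(containers), "containers")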
Dec 08 20:11:00 compute-0 podman[213337]: 2025-12-08 20:11:00.490522869 +0000 UTC m=+0.057470769 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 08 20:11:01 compute-0 openstack_network_exporter[204149]: ERROR   20:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:11:01 compute-0 openstack_network_exporter[204149]: ERROR   20:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:11:01 compute-0 openstack_network_exporter[204149]: ERROR   20:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:11:01 compute-0 openstack_network_exporter[204149]: ERROR   20:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:11:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:11:01 compute-0 openstack_network_exporter[204149]: ERROR   20:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:11:01 compute-0 openstack_network_exporter[204149]: 
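Editor's note: the exporter errors above mean it found no *.ctl control socket for ovsdb-server or ovn-northd; on a compute node that is plausible, since ovn-northd runs on the control plane. Assuming the usual run directories, a quick check for which control sockets exist locally:

    import glob

    # ovsdb-server / ovs-vswitchd normally create <name>.<pid>.ctl sockets under
    # /var/run/openvswitch; ovn daemons use /var/run/ovn (paths are assumptions).
    for pattern in ("/var/run/openvswitch/*.ctl", "/var/run/ovn/*.ctl"):
        print(pattern, "->", glob.glob(pattern))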
Dec 08 20:11:07 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:11:07.504 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ea:67:f9', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1e:d7:e5:ba:bd:f4'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:11:07 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:11:07.505 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 08 20:11:07 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:11:07.507 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:11:08 compute-0 podman[213361]: 2025-12-08 20:11:08.498264466 +0000 UTC m=+0.056526891 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:11:08 compute-0 podman[213360]: 2025-12-08 20:11:08.532890027 +0000 UTC m=+0.100447819 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller)
Dec 08 20:11:12 compute-0 podman[213406]: 2025-12-08 20:11:12.466981786 +0000 UTC m=+0.044851799 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:11:15 compute-0 nova_compute[187787]: 2025-12-08 20:11:15.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:11:15 compute-0 nova_compute[187787]: 2025-12-08 20:11:15.778 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:11:15 compute-0 nova_compute[187787]: 2025-12-08 20:11:15.779 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:11:15 compute-0 nova_compute[187787]: 2025-12-08 20:11:15.779 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:11:15 compute-0 nova_compute[187787]: 2025-12-08 20:11:15.794 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:11:15 compute-0 nova_compute[187787]: 2025-12-08 20:11:15.795 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:11:15 compute-0 nova_compute[187787]: 2025-12-08 20:11:15.795 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:11:15 compute-0 nova_compute[187787]: 2025-12-08 20:11:15.795 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:11:15 compute-0 nova_compute[187787]: 2025-12-08 20:11:15.795 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:11:17 compute-0 nova_compute[187787]: 2025-12-08 20:11:17.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:11:17 compute-0 nova_compute[187787]: 2025-12-08 20:11:17.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:11:17 compute-0 nova_compute[187787]: 2025-12-08 20:11:17.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:11:17 compute-0 nova_compute[187787]: 2025-12-08 20:11:17.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:11:17 compute-0 nova_compute[187787]: 2025-12-08 20:11:17.815 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:11:17 compute-0 nova_compute[187787]: 2025-12-08 20:11:17.816 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:11:17 compute-0 nova_compute[187787]: 2025-12-08 20:11:17.817 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
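Editor's note: the acquire/release pairs above are nova's resource tracker serializing on the "compute_resources" semaphore via oslo.concurrency. A minimal equivalent using oslo_concurrency.lockutils directly (this is a sketch, not nova's own synchronized wrapper):

    from oslo_concurrency import lockutils

    @lockutils.synchronized("compute_resources")
    def audit_resources():
        # Runs while holding the same named semaphore the tracker logs acquiring and releasing.
        print("auditing while holding compute_resources")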
Dec 08 20:11:17 compute-0 nova_compute[187787]: 2025-12-08 20:11:17.817 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.017 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.018 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5936MB free_disk=72.91748809814453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.018 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.019 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.083 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.083 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.109 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.128 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.129 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:11:18 compute-0 nova_compute[187787]: 2025-12-08 20:11:18.129 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.111s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
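Editor's note: the inventory reported to placement a few lines above combines total, reserved and allocation_ratio; placement's effective capacity per resource class is (total - reserved) * allocation_ratio. Checking the logged values:

    inventory = {
        "MEMORY_MB": {"total": 7680, "reserved": 512, "allocation_ratio": 1.0},
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "DISK_GB":   {"total": 79,   "reserved": 0,   "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(rc, round(capacity, 2))   # MEMORY_MB 7168.0, VCPU 32.0, DISK_GB 71.1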
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.833 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.833 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.834 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.834 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.839 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.840 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.841 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d29af5610>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:11:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:11:20 compute-0 podman[213433]: 2025-12-08 20:11:20.481730599 +0000 UTC m=+0.056893801 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 08 20:11:20 compute-0 sshd-session[213430]: Received disconnect from 103.172.28.62 port 33456:11: Bye Bye [preauth]
Dec 08 20:11:20 compute-0 sshd-session[213430]: Disconnected from authenticating user root 103.172.28.62 port 33456 [preauth]
Dec 08 20:11:27 compute-0 podman[213455]: 2025-12-08 20:11:27.504753854 +0000 UTC m=+0.076815899 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 08 20:11:28 compute-0 sshd-session[213478]: Invalid user anil from 172.96.182.111 port 47470
Dec 08 20:11:28 compute-0 sshd-session[213478]: Received disconnect from 172.96.182.111 port 47470:11: Bye Bye [preauth]
Dec 08 20:11:28 compute-0 sshd-session[213478]: Disconnected from invalid user anil 172.96.182.111 port 47470 [preauth]
Dec 08 20:11:29 compute-0 podman[202017]: time="2025-12-08T20:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:11:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:11:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3411 "" "Go-http-client/1.1"
Dec 08 20:11:30 compute-0 podman[213481]: 2025-12-08 20:11:30.526938364 +0000 UTC m=+0.094036750 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:11:30 compute-0 podman[213507]: 2025-12-08 20:11:30.713621021 +0000 UTC m=+0.142966895 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:11:31 compute-0 openstack_network_exporter[204149]: ERROR   20:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:11:31 compute-0 openstack_network_exporter[204149]: ERROR   20:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:11:31 compute-0 openstack_network_exporter[204149]: ERROR   20:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:11:31 compute-0 openstack_network_exporter[204149]: ERROR   20:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:11:31 compute-0 openstack_network_exporter[204149]: ERROR   20:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:11:39 compute-0 podman[213528]: 2025-12-08 20:11:39.490307808 +0000 UTC m=+0.060649717 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 08 20:11:39 compute-0 podman[213527]: 2025-12-08 20:11:39.514357667 +0000 UTC m=+0.085695118 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:11:43 compute-0 podman[213572]: 2025-12-08 20:11:43.490141892 +0000 UTC m=+0.061976700 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:11:51 compute-0 podman[213598]: 2025-12-08 20:11:51.480305233 +0000 UTC m=+0.050651328 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 08 20:11:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:11:54.984 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:11:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:11:54.985 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:11:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:11:54.985 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:11:58 compute-0 podman[213618]: 2025-12-08 20:11:58.48483728 +0000 UTC m=+0.056584882 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, distribution-scope=public, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
Dec 08 20:11:59 compute-0 sshd-session[213596]: ssh_dispatch_run_fatal: Connection from 45.78.228.32 port 51324: Connection timed out [preauth]
Dec 08 20:11:59 compute-0 podman[202017]: time="2025-12-08T20:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:11:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:11:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3410 "" "Go-http-client/1.1"
Dec 08 20:12:01 compute-0 openstack_network_exporter[204149]: ERROR   20:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:12:01 compute-0 openstack_network_exporter[204149]: ERROR   20:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:12:01 compute-0 openstack_network_exporter[204149]: ERROR   20:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:12:01 compute-0 openstack_network_exporter[204149]: ERROR   20:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:12:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:12:01 compute-0 openstack_network_exporter[204149]: ERROR   20:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:12:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:12:01 compute-0 podman[213640]: 2025-12-08 20:12:01.484582868 +0000 UTC m=+0.048203052 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 08 20:12:01 compute-0 podman[213639]: 2025-12-08 20:12:01.517337638 +0000 UTC m=+0.084315565 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 08 20:12:02 compute-0 sshd-session[213637]: Invalid user anil from 47.76.127.165 port 34324
Dec 08 20:12:02 compute-0 sshd-session[213637]: Received disconnect from 47.76.127.165 port 34324:11: Bye Bye [preauth]
Dec 08 20:12:02 compute-0 sshd-session[213637]: Disconnected from invalid user anil 47.76.127.165 port 34324 [preauth]
Dec 08 20:12:10 compute-0 podman[213683]: 2025-12-08 20:12:10.506374216 +0000 UTC m=+0.082765678 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:12:10 compute-0 podman[213684]: 2025-12-08 20:12:10.510295887 +0000 UTC m=+0.083938374 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.vendor=CentOS)
Dec 08 20:12:14 compute-0 podman[213727]: 2025-12-08 20:12:14.478082494 +0000 UTC m=+0.049146172 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:12:16 compute-0 nova_compute[187787]: 2025-12-08 20:12:16.124 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:16 compute-0 nova_compute[187787]: 2025-12-08 20:12:16.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:16 compute-0 nova_compute[187787]: 2025-12-08 20:12:16.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:12:16 compute-0 nova_compute[187787]: 2025-12-08 20:12:16.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:12:16 compute-0 nova_compute[187787]: 2025-12-08 20:12:16.879 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:12:16 compute-0 nova_compute[187787]: 2025-12-08 20:12:16.879 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:16 compute-0 nova_compute[187787]: 2025-12-08 20:12:16.880 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:12:17 compute-0 nova_compute[187787]: 2025-12-08 20:12:17.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:17 compute-0 nova_compute[187787]: 2025-12-08 20:12:17.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:17 compute-0 nova_compute[187787]: 2025-12-08 20:12:17.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:17 compute-0 nova_compute[187787]: 2025-12-08 20:12:17.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:17 compute-0 nova_compute[187787]: 2025-12-08 20:12:17.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.112 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.113 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.113 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.113 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.284 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.285 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5953MB free_disk=72.91559600830078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.285 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.286 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.518 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.519 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.550 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.672 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.676 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:12:18 compute-0 nova_compute[187787]: 2025-12-08 20:12:18.676 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.390s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:12:20 compute-0 nova_compute[187787]: 2025-12-08 20:12:20.672 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:20 compute-0 nova_compute[187787]: 2025-12-08 20:12:20.704 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:12:22 compute-0 podman[213754]: 2025-12-08 20:12:22.543794046 +0000 UTC m=+0.102080240 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:12:29 compute-0 sshd-session[213751]: error: kex_exchange_identification: read: Connection timed out
Dec 08 20:12:29 compute-0 sshd-session[213751]: banner exchange: Connection from 125.39.179.192 port 34756: Connection timed out
Dec 08 20:12:29 compute-0 podman[213774]: 2025-12-08 20:12:29.511376322 +0000 UTC m=+0.077876875 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.6, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public)
Dec 08 20:12:29 compute-0 podman[202017]: time="2025-12-08T20:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:12:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:12:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3416 "" "Go-http-client/1.1"
Dec 08 20:12:30 compute-0 sshd-session[213753]: error: kex_exchange_identification: read: Connection timed out
Dec 08 20:12:30 compute-0 sshd-session[213753]: banner exchange: Connection from 117.72.219.136 port 57548: Connection timed out
Dec 08 20:12:31 compute-0 openstack_network_exporter[204149]: ERROR   20:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:12:31 compute-0 openstack_network_exporter[204149]: ERROR   20:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:12:31 compute-0 openstack_network_exporter[204149]: ERROR   20:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:12:31 compute-0 openstack_network_exporter[204149]: ERROR   20:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:12:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:12:31 compute-0 openstack_network_exporter[204149]: ERROR   20:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:12:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:12:32 compute-0 podman[213796]: 2025-12-08 20:12:32.498323412 +0000 UTC m=+0.063200198 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 08 20:12:32 compute-0 podman[213795]: 2025-12-08 20:12:32.507421196 +0000 UTC m=+0.075738489 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:12:39 compute-0 sshd-session[213837]: Received disconnect from 45.174.162.68 port 33858:11: Bye Bye [preauth]
Dec 08 20:12:39 compute-0 sshd-session[213837]: Disconnected from authenticating user root 45.174.162.68 port 33858 [preauth]
Dec 08 20:12:41 compute-0 podman[213840]: 2025-12-08 20:12:41.522402767 +0000 UTC m=+0.088928785 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 08 20:12:41 compute-0 podman[213839]: 2025-12-08 20:12:41.572678994 +0000 UTC m=+0.140283185 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:12:42 compute-0 sshd-session[213859]: Received disconnect from 172.96.182.111 port 46618:11: Bye Bye [preauth]
Dec 08 20:12:42 compute-0 sshd-session[213859]: Disconnected from authenticating user root 172.96.182.111 port 46618 [preauth]
Dec 08 20:12:44 compute-0 podman[213887]: 2025-12-08 20:12:44.724072026 +0000 UTC m=+0.042914986 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:12:52 compute-0 sshd-session[213911]: Invalid user edge from 103.172.28.62 port 43760
Dec 08 20:12:52 compute-0 sshd-session[213911]: Received disconnect from 103.172.28.62 port 43760:11: Bye Bye [preauth]
Dec 08 20:12:52 compute-0 sshd-session[213911]: Disconnected from invalid user edge 103.172.28.62 port 43760 [preauth]
Dec 08 20:12:53 compute-0 podman[213913]: 2025-12-08 20:12:53.517786678 +0000 UTC m=+0.085036766 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 08 20:12:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:12:54.986 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:12:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:12:54.986 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:12:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:12:54.986 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:12:59 compute-0 podman[202017]: time="2025-12-08T20:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:12:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:12:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3414 "" "Go-http-client/1.1"
Dec 08 20:13:00 compute-0 podman[213933]: 2025-12-08 20:13:00.513276166 +0000 UTC m=+0.082167244 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, release=1755695350)
Dec 08 20:13:01 compute-0 openstack_network_exporter[204149]: ERROR   20:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:13:01 compute-0 openstack_network_exporter[204149]: ERROR   20:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:13:01 compute-0 openstack_network_exporter[204149]: ERROR   20:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:13:01 compute-0 openstack_network_exporter[204149]: ERROR   20:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:13:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:13:01 compute-0 openstack_network_exporter[204149]: ERROR   20:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:13:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:13:03 compute-0 podman[213955]: 2025-12-08 20:13:03.491136282 +0000 UTC m=+0.055786903 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:13:03 compute-0 podman[213954]: 2025-12-08 20:13:03.529888825 +0000 UTC m=+0.090526359 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:13:12 compute-0 podman[213994]: 2025-12-08 20:13:12.492225516 +0000 UTC m=+0.061098910 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:13:12 compute-0 podman[213993]: 2025-12-08 20:13:12.512719803 +0000 UTC m=+0.081632298 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:13:14 compute-0 nova_compute[187787]: 2025-12-08 20:13:14.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:14 compute-0 nova_compute[187787]: 2025-12-08 20:13:14.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 08 20:13:14 compute-0 nova_compute[187787]: 2025-12-08 20:13:14.805 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 08 20:13:14 compute-0 nova_compute[187787]: 2025-12-08 20:13:14.806 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:14 compute-0 nova_compute[187787]: 2025-12-08 20:13:14.807 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 08 20:13:14 compute-0 nova_compute[187787]: 2025-12-08 20:13:14.826 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:15 compute-0 podman[214038]: 2025-12-08 20:13:15.46529776 +0000 UTC m=+0.043030889 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 08 20:13:15 compute-0 nova_compute[187787]: 2025-12-08 20:13:15.837 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:17 compute-0 nova_compute[187787]: 2025-12-08 20:13:17.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:17 compute-0 nova_compute[187787]: 2025-12-08 20:13:17.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:17 compute-0 nova_compute[187787]: 2025-12-08 20:13:17.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:13:17 compute-0 nova_compute[187787]: 2025-12-08 20:13:17.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:17 compute-0 nova_compute[187787]: 2025-12-08 20:13:17.825 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:13:17 compute-0 nova_compute[187787]: 2025-12-08 20:13:17.826 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:13:17 compute-0 nova_compute[187787]: 2025-12-08 20:13:17.826 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:13:17 compute-0 nova_compute[187787]: 2025-12-08 20:13:17.826 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.015 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.016 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5963MB free_disk=72.91461563110352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.016 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.016 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.148 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.149 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.233 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.252 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.253 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:13:18 compute-0 nova_compute[187787]: 2025-12-08 20:13:18.253 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.237s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
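[editor's note] The resource-tracker audit above ends with nova reporting this node's inventory to Placement (the set_inventory_for_provider line at 20:13:18.252): MEMORY_MB total=7680 reserved=512 ratio=1.0, VCPU total=8 ratio=4.0, DISK_GB total=79 ratio=0.9. As a reading aid only, the sketch below reproduces the standard Placement capacity arithmetic, (total - reserved) * allocation_ratio, on the values copied from that log line. It is plain standalone Python, not code from nova or placement.

    # Hedged sketch: capacity arithmetic applied to the inventory logged above.
    # The dict values are copied verbatim from the set_inventory_for_provider line;
    # only the fields needed for the calculation are kept.
    inventory = {
        'MEMORY_MB': {'total': 7680, 'reserved': 512, 'allocation_ratio': 1.0},
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'DISK_GB':   {'total': 79,   'reserved': 0,   'allocation_ratio': 0.9},
    }

    for resource_class, inv in inventory.items():
        # Schedulable capacity = (total - reserved) * allocation_ratio
        effective = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{resource_class}: {effective:g} schedulable units")

With these numbers the node exposes 7168 MB of RAM, 32 vCPUs (8 physical at a 4.0 overcommit ratio) and ~71 GB of disk to the scheduler, which is consistent with the "Final resource view" line showing 0 of everything currently used.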
Dec 08 20:13:19 compute-0 nova_compute[187787]: 2025-12-08 20:13:19.255 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:19 compute-0 nova_compute[187787]: 2025-12-08 20:13:19.256 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:13:19 compute-0 nova_compute[187787]: 2025-12-08 20:13:19.256 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:13:19 compute-0 nova_compute[187787]: 2025-12-08 20:13:19.276 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:13:19 compute-0 nova_compute[187787]: 2025-12-08 20:13:19.277 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:19 compute-0 nova_compute[187787]: 2025-12-08 20:13:19.278 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:19 compute-0 nova_compute[187787]: 2025-12-08 20:13:19.278 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.833 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.834 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.834 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.835 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.836 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.837 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.837 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.838 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.838 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.839 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.839 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.840 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.841 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.850 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:13:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:13:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
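[editor's note] Every pollster in the cycle above logs "Skip pollster ..., no resources found this cycle" because the shared local_instances discovery returned an empty list: the node hosts no guests, matching the earlier nova audit (0 allocated vCPUs, no instances to heal). The outline below is a minimal sketch of that control flow under those assumptions; the function names and POLLSTERS list are hypothetical stand-ins, not the real ceilometer.polling.manager API, while the single-worker ThreadPoolExecutor and per-cycle discovery cache mirror what the log reports ("Processing pollsters for [pollsters] with [1] threads").

    # Hedged sketch of the skip-on-empty-discovery behaviour seen in the cycle above.
    # discover_local_instances(), run_pollster() and POLLSTERS are illustrative names only.
    from concurrent.futures import ThreadPoolExecutor

    POLLSTERS = ['disk.device.read.latency', 'power.state', 'cpu', 'memory.usage']

    def discover_local_instances():
        # On this node the libvirt discovery finds no guests, so the cycle works on [].
        return []

    def run_pollster(name, resources):
        if not resources:
            print(f"Skip pollster {name}, no resources found this cycle")
            return
        # ...otherwise collect and publish one sample per resource here...

    def run_polling_cycle():
        resources = discover_local_instances()        # discovery result shared by the cycle
        with ThreadPoolExecutor(max_workers=1) as executor:   # "[1] threads" as logged
            for name in POLLSTERS:
                executor.submit(run_pollster, name, resources)

    run_polling_cycle()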
Dec 08 20:13:20 compute-0 nova_compute[187787]: 2025-12-08 20:13:20.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:13:24 compute-0 podman[214065]: 2025-12-08 20:13:24.506920985 +0000 UTC m=+0.077834428 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 08 20:13:27 compute-0 sshd-session[214085]: Invalid user ubuntu from 47.76.127.165 port 55448
Dec 08 20:13:27 compute-0 sshd-session[214085]: Received disconnect from 47.76.127.165 port 55448:11: Bye Bye [preauth]
Dec 08 20:13:27 compute-0 sshd-session[214085]: Disconnected from invalid user ubuntu 47.76.127.165 port 55448 [preauth]
Dec 08 20:13:29 compute-0 podman[202017]: time="2025-12-08T20:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:13:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:13:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3418 "" "Go-http-client/1.1"
Dec 08 20:13:31 compute-0 openstack_network_exporter[204149]: ERROR   20:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:13:31 compute-0 openstack_network_exporter[204149]: ERROR   20:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:13:31 compute-0 openstack_network_exporter[204149]: ERROR   20:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:13:31 compute-0 openstack_network_exporter[204149]: ERROR   20:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:13:31 compute-0 openstack_network_exporter[204149]: ERROR   20:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:13:31 compute-0 podman[214087]: 2025-12-08 20:13:31.483342913 +0000 UTC m=+0.055576555 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6)
Dec 08 20:13:34 compute-0 podman[214109]: 2025-12-08 20:13:34.48823194 +0000 UTC m=+0.053959744 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:13:34 compute-0 podman[214110]: 2025-12-08 20:13:34.492842736 +0000 UTC m=+0.055505843 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 08 20:13:41 compute-0 sshd-session[214154]: Received disconnect from 200.155.38.219 port 16276:11: Bye Bye [preauth]
Dec 08 20:13:41 compute-0 sshd-session[214154]: Disconnected from authenticating user root 200.155.38.219 port 16276 [preauth]
Dec 08 20:13:42 compute-0 sshd-session[214152]: Received disconnect from 45.78.217.210 port 43304:11: Bye Bye [preauth]
Dec 08 20:13:42 compute-0 sshd-session[214152]: Disconnected from authenticating user root 45.78.217.210 port 43304 [preauth]
Dec 08 20:13:43 compute-0 podman[214157]: 2025-12-08 20:13:43.495600644 +0000 UTC m=+0.065021033 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:13:43 compute-0 podman[214156]: 2025-12-08 20:13:43.568994281 +0000 UTC m=+0.140758394 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Dec 08 20:13:46 compute-0 podman[214202]: 2025-12-08 20:13:46.473818718 +0000 UTC m=+0.047291200 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:13:51 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:13:51.642 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ea:67:f9', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1e:d7:e5:ba:bd:f4'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:13:51 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:13:51.643 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 08 20:13:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:13:53.646 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:13:54 compute-0 sshd-session[214227]: Received disconnect from 172.96.182.111 port 45760:11: Bye Bye [preauth]
Dec 08 20:13:54 compute-0 sshd-session[214227]: Disconnected from authenticating user root 172.96.182.111 port 45760 [preauth]
Dec 08 20:13:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:13:54.987 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:13:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:13:54.988 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:13:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:13:54.988 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:13:55 compute-0 podman[214229]: 2025-12-08 20:13:55.511005708 +0000 UTC m=+0.068169309 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, tcib_managed=true)
Dec 08 20:13:59 compute-0 podman[202017]: time="2025-12-08T20:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:13:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:13:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3424 "" "Go-http-client/1.1"
Dec 08 20:14:01 compute-0 openstack_network_exporter[204149]: ERROR   20:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:14:01 compute-0 openstack_network_exporter[204149]: ERROR   20:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:14:01 compute-0 openstack_network_exporter[204149]: ERROR   20:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:14:01 compute-0 openstack_network_exporter[204149]: ERROR   20:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:14:01 compute-0 openstack_network_exporter[204149]: ERROR   20:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:14:02 compute-0 podman[214249]: 2025-12-08 20:14:02.498744807 +0000 UTC m=+0.063807572 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, architecture=x86_64)
Dec 08 20:14:05 compute-0 podman[214271]: 2025-12-08 20:14:05.486263981 +0000 UTC m=+0.053609656 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Dec 08 20:14:05 compute-0 podman[214270]: 2025-12-08 20:14:05.500926986 +0000 UTC m=+0.074583677 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:14:12 compute-0 sshd-session[214314]: Invalid user zy from 45.174.162.68 port 56634
Dec 08 20:14:12 compute-0 sshd-session[214314]: Received disconnect from 45.174.162.68 port 56634:11: Bye Bye [preauth]
Dec 08 20:14:12 compute-0 sshd-session[214314]: Disconnected from invalid user zy 45.174.162.68 port 56634 [preauth]
Dec 08 20:14:14 compute-0 podman[214317]: 2025-12-08 20:14:14.502811199 +0000 UTC m=+0.064171764 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:14:14 compute-0 podman[214316]: 2025-12-08 20:14:14.55112429 +0000 UTC m=+0.107989805 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 08 20:14:16 compute-0 nova_compute[187787]: 2025-12-08 20:14:16.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:17 compute-0 podman[214362]: 2025-12-08 20:14:17.481760907 +0000 UTC m=+0.056967609 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 08 20:14:17 compute-0 nova_compute[187787]: 2025-12-08 20:14:17.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:17 compute-0 nova_compute[187787]: 2025-12-08 20:14:17.813 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:17 compute-0 nova_compute[187787]: 2025-12-08 20:14:17.814 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:17 compute-0 nova_compute[187787]: 2025-12-08 20:14:17.814 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:17 compute-0 nova_compute[187787]: 2025-12-08 20:14:17.815 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.011 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.013 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5970MB free_disk=72.91563034057617GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.013 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.014 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.164 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.165 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.181 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing inventories for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.203 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Updating ProviderTree inventory for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.203 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.228 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing aggregate associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.259 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing trait associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,HW_CPU_X86_FMA3,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE4A,HW_CPU_X86_ABM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.303 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.338 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.339 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:14:18 compute-0 nova_compute[187787]: 2025-12-08 20:14:18.340 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:19 compute-0 nova_compute[187787]: 2025-12-08 20:14:19.340 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:19 compute-0 nova_compute[187787]: 2025-12-08 20:14:19.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:19 compute-0 nova_compute[187787]: 2025-12-08 20:14:19.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:19 compute-0 nova_compute[187787]: 2025-12-08 20:14:19.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:14:20 compute-0 nova_compute[187787]: 2025-12-08 20:14:20.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:20 compute-0 nova_compute[187787]: 2025-12-08 20:14:20.800 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:20 compute-0 nova_compute[187787]: 2025-12-08 20:14:20.801 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:14:20 compute-0 nova_compute[187787]: 2025-12-08 20:14:20.801 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:14:20 compute-0 nova_compute[187787]: 2025-12-08 20:14:20.825 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:14:20 compute-0 nova_compute[187787]: 2025-12-08 20:14:20.826 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:20 compute-0 nova_compute[187787]: 2025-12-08 20:14:20.827 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:21 compute-0 nova_compute[187787]: 2025-12-08 20:14:21.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.020 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.021 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.062 187791 DEBUG nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.196 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.197 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.207 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.208 187791 INFO nova.compute.claims [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Claim successful on node compute-0.ctlplane.example.com
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.378 187791 DEBUG nova.compute.provider_tree [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.396 187791 DEBUG nova.scheduler.client.report [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.424 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.425 187791 DEBUG nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.508 187791 DEBUG nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.509 187791 DEBUG nova.network.neutron [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.566 187791 INFO nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.603 187791 DEBUG nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.779 187791 DEBUG nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.781 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.782 187791 INFO nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Creating image(s)
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.784 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.784 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.785 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.786 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:23 compute-0 nova_compute[187787]: 2025-12-08 20:14:23.787 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
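[editor's note] The "Acquiring lock" / "acquired ... waited" / "released ... held" DEBUG pairs above come from oslo.concurrency's file-based locks, which serialize base-image downloads across workers. A minimal sketch of the same pattern, assuming oslo.concurrency is installed (the lock name is the cached-image hash from the log; the lock_path is illustrative, not necessarily Nova's configured one):

    from oslo_concurrency import lockutils

    # External (file-backed) lock: only one process fetches a given base
    # image into the cache at a time; waiters produce the DEBUG lines above.
    @lockutils.synchronized('2c120b1c2b26a18d3cbeffa85093758d4c027fac',
                            external=True,
                            lock_path='/var/lib/nova/instances/locks')
    def fetch_func_sync():
        # download / convert the base image here
        pass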
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.272 187791 WARNING oslo_policy.policy [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.273 187791 WARNING oslo_policy.policy [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
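[editor's note] The two WARNINGs above are oslo.policy flagging a JSON-formatted policy file; the fix is a one-off conversion with the tool named in the message. A sketch of running it, assuming /etc/nova/policy.json is the file in use (paths are illustrative; flag names follow the oslo.policy documentation linked in the warning):

    # One-off conversion using the CLI shipped with oslo.policy.
    import subprocess

    subprocess.run(
        ["oslopolicy-convert-json-to-yaml",
         "--namespace", "nova",
         "--policy-file", "/etc/nova/policy.json",
         "--output-file", "/etc/nova/policy.yaml"],
        check=True)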
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.276 187791 DEBUG nova.policy [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6e0c4248254a4bcb850e5443f0b8ad8b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ad1a4d6aebb84f6fb894551cd68d2ae1', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
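[editor's note] The "Policy check ... failed" DEBUG above is a soft check: Nova probes whether this non-admin requester may attach to an external network and simply proceeds without that privilege when the rule denies it. A minimal sketch of the same kind of check with oslo.policy (rule strings and credentials are illustrative, not Nova's registered defaults):

    from oslo_config import cfg
    from oslo_policy import policy

    enforcer = policy.Enforcer(cfg.CONF)
    enforcer.register_default(policy.RuleDefault("context_is_admin", "role:admin"))
    enforcer.register_default(policy.RuleDefault(
        "network:attach_external_network", "rule:context_is_admin"))

    creds = {"roles": ["member", "reader"], "is_admin": False,
             "project_id": "ad1a4d6aebb84f6fb894551cd68d2ae1"}
    # Without do_raise, enforce() just returns False, which is why the
    # failure above is only logged at DEBUG and the build continues.
    allowed = enforcer.enforce("network:attach_external_network", {}, creds)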
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.283 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.284 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.303 187791 DEBUG nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.402 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.403 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.410 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
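[editor's note] "Require both a host and instance NUMA topology" means this flavor/image requested no guest NUMA topology, so NUMA fitting is skipped and the claim is made against the host as a whole. For contrast, an illustrative flavor extra-spec set that would give the guest a NUMA topology (not what this flavor carries):

    # Illustrative flavor extra specs; with these, numa_fit_instance_to_host
    # would actually have an instance topology to place on the host.
    extra_specs = {
        "hw:numa_nodes": "1",
    }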
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.410 187791 INFO nova.compute.claims [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Claim successful on node compute-0.ctlplane.example.com
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.571 187791 DEBUG nova.compute.provider_tree [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.597 187791 DEBUG nova.scheduler.client.report [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
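[editor's note] The inventory dict above is what Placement turns into schedulable capacity per resource class, roughly capacity = (total - reserved) * allocation_ratio. A quick worked check against the logged values (plain arithmetic, not Nova code):

    inventory = {
        "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 7680, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB":   {"total": 79,   "reserved": 0,   "allocation_ratio": 0.9},
    }
    for rc, inv in inventory.items():
        capacity = int((inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
        print(rc, capacity)   # VCPU 32, MEMORY_MB 7168, DISK_GB 71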
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.644 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.645 187791 DEBUG nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.753 187791 DEBUG nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.754 187791 DEBUG nova.network.neutron [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.792 187791 INFO nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.825 187791 DEBUG nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.959 187791 DEBUG nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.960 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.961 187791 INFO nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Creating image(s)
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.961 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "/var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.962 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "/var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.962 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "/var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:24 compute-0 nova_compute[187787]: 2025-12-08 20:14:24.963 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:25 compute-0 nova_compute[187787]: 2025-12-08 20:14:25.429 187791 DEBUG nova.policy [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5c734374332f4a18956eedb746b128bf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '71532308007a48d5aef697fbd39501f6', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 08 20:14:25 compute-0 nova_compute[187787]: 2025-12-08 20:14:25.744 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:25 compute-0 nova_compute[187787]: 2025-12-08 20:14:25.805 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
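[editor's note] The oslo_concurrency.prlimit wrapper in the commands above caps qemu-img's address space (--as=1073741824, i.e. 1 GiB) and CPU time (--cpu=30 s) so a malformed image cannot exhaust the compute host. A sketch of issuing the same kind of call through oslo.concurrency (path taken from the log; not Nova's exact call site):

    from oslo_concurrency import processutils

    limits = processutils.ProcessLimits(
        address_space=1 * 1024 ** 3,   # becomes --as=1073741824
        cpu_time=30)                   # becomes --cpu=30
    out, err = processutils.execute(
        'env', 'LC_ALL=C', 'LANG=C',
        'qemu-img', 'info',
        '/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac',
        '--force-share', '--output=json',
        prlimit=limits)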
Dec 08 20:14:25 compute-0 nova_compute[187787]: 2025-12-08 20:14:25.807 187791 DEBUG nova.virt.images [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] ffae60d8-1843-4b3a-9d11-b077095cedb9 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Dec 08 20:14:25 compute-0 nova_compute[187787]: 2025-12-08 20:14:25.808 187791 DEBUG nova.privsep.utils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Dec 08 20:14:25 compute-0 nova_compute[187787]: 2025-12-08 20:14:25.809 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac.part /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.007 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac.part /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac.converted" returned: 0 in 0.199s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.012 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.054 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "9f228c07-c6ac-479c-9edb-ceebc19eac87" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.054 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.064 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac.converted --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.065 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.076 187791 INFO oslo.privsep.daemon [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpuhq9lpo2/privsep.sock']
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.077 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 1.115s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.078 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.100 187791 DEBUG nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.184 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.184 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.191 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.191 187791 INFO nova.compute.claims [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Claim successful on node compute-0.ctlplane.example.com
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.355 187791 DEBUG nova.compute.provider_tree [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 08 20:14:26 compute-0 podman[214404]: 2025-12-08 20:14:26.512622541 +0000 UTC m=+0.076941070 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.4, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42)
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.559 187791 ERROR nova.scheduler.client.report [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [req-a618d76b-54fd-4d85-bb6f-44ec227bbb8c] Failed to update inventory to [{'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID b3899b98-89be-4b90-bd85-9c57a93a16c4.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-a618d76b-54fd-4d85-bb6f-44ec227bbb8c"}]}
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.684 187791 DEBUG nova.scheduler.client.report [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Refreshing inventories for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.705 187791 DEBUG nova.scheduler.client.report [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Updating ProviderTree inventory for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.705 187791 DEBUG nova.compute.provider_tree [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.749 187791 DEBUG nova.scheduler.client.report [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Refreshing aggregate associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.793 187791 INFO oslo.privsep.daemon [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Spawned new privsep daemon via rootwrap
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.796 187791 DEBUG nova.scheduler.client.report [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Refreshing trait associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,HW_CPU_X86_FMA3,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE4A,HW_CPU_X86_ABM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.654 214425 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.658 214425 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.660 214425 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.660 214425 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214425
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.804 187791 WARNING oslo_privsep.priv_context [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] privsep daemon already running
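[editor's note] The rootwrap/privsep lines above show nova_compute spawning its privilege-separation helper: the unprivileged service forks a root daemon that keeps only the capability set logged at 20:14:26.660 and proxies privileged calls over the privsep socket; the second request then finds the daemon "already running". A minimal sketch of how such a context is declared with oslo.privsep (module layout and the example function are illustrative; the capability list mirrors the one logged):

    from oslo_privsep import capabilities, priv_context

    sys_admin_pctxt = priv_context.PrivContext(
        'nova',
        cfg_section='nova_sys_admin',
        pypath=__name__ + '.sys_admin_pctxt',
        capabilities=[capabilities.CAP_CHOWN,
                      capabilities.CAP_DAC_OVERRIDE,
                      capabilities.CAP_DAC_READ_SEARCH,
                      capabilities.CAP_FOWNER,
                      capabilities.CAP_NET_ADMIN,
                      capabilities.CAP_SYS_ADMIN])

    @sys_admin_pctxt.entrypoint
    def chown_instance_dir(path, uid, gid):
        # Executes inside the root privsep daemon, not in nova-compute itself.
        import os
        os.chown(path, uid, gid)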
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.889 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.901 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.922 187791 DEBUG nova.compute.provider_tree [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.948 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.949 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.950 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.960 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.981 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:26 compute-0 nova_compute[187787]: 2025-12-08 20:14:26.983 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.000 187791 DEBUG nova.scheduler.client.report [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Updated inventory for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with generation 5 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7680, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.001 187791 DEBUG nova.compute.provider_tree [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Updating resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4 generation from 5 to 6 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.001 187791 DEBUG nova.compute.provider_tree [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
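[editor's note] The ERROR at 20:14:26.559 followed by the inventory refresh and the successful update (generation 5 -> 6) is Placement's optimistic concurrency control working as designed: every write carries the provider generation, a stale generation gets 409 placement.concurrent_update, and the client re-reads the provider and retries. A rough sketch of that retry pattern against the Placement REST API (plain requests, illustrative; not Nova's scheduler report client):

    import requests

    def put_inventories(base_url, headers, rp_uuid, inventories):
        # headers are assumed to carry X-Auth-Token and an
        # OpenStack-API-Version header for the placement service.
        while True:
            rp = requests.get(f"{base_url}/resource_providers/{rp_uuid}",
                              headers=headers).json()
            resp = requests.put(
                f"{base_url}/resource_providers/{rp_uuid}/inventories",
                headers=headers,
                json={"resource_provider_generation": rp["generation"],
                      "inventories": inventories})
            if resp.status_code != 409:
                return resp
            # another writer bumped the generation between GET and PUT; retry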
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.022 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.023 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.040 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.855s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.041 187791 DEBUG nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.053 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.054 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
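[editor's note] The qemu-img create above is the copy-on-write layout Nova's Qcow2 backend uses: the 1 GiB instance disk is a qcow2 overlay whose backing file is the shared raw base image under _base. A small sketch that reproduces the layout and inspects the resulting chain (paths taken from the log; assumes qemu-img is installed):

    import json
    import subprocess

    base = "/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac"
    overlay = "/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk"

    subprocess.run(["qemu-img", "create", "-f", "qcow2",
                    "-o", f"backing_file={base},backing_fmt=raw",
                    overlay, "1073741824"], check=True)

    # --backing-chain lists the overlay plus its raw backing file
    chain = json.loads(subprocess.check_output(
        ["qemu-img", "info", "--backing-chain", "--output=json", overlay]))
    print([img["filename"] for img in chain])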
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.055 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.071 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.082 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.124 187791 DEBUG nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.125 187791 DEBUG nova.network.neutron [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.130 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.131 187791 DEBUG nova.virt.disk.api [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Checking if we can resize image /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.132 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.153 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.155 187791 INFO nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.159 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.188 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.190 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.190 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.206 187791 DEBUG nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.211 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.212 187791 DEBUG nova.virt.disk.api [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Cannot resize image /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
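[editor's note] "Cannot resize image ... to a smaller size" is the grow-only guard in nova.virt.disk.api: the requested root-disk size here (size=1073741824 at 20:14:27.131) equals the virtual size the overlay was created with, so there is nothing to grow and the resize step is skipped. A sketch of the logic, not the actual Nova source:

    def can_resize_image(current_virtual_size, requested_size):
        # Nova only ever grows a file-backed disk; equal or smaller targets
        # are skipped with the "Cannot resize image ..." DEBUG seen above.
        return requested_size > current_virtual_size

    print(can_resize_image(1073741824, 1073741824))  # False -> resize skipped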
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.212 187791 DEBUG nova.objects.instance [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lazy-loading 'migration_context' on Instance uuid 31b38d28-b90e-434c-9967-912987aee08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.242 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.243 187791 DEBUG nova.virt.disk.api [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Checking if we can resize image /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.243 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.283 187791 DEBUG nova.network.neutron [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Successfully created port: 488a5725-c797-4165-b8ce-319c48f2e8b8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.308 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.309 187791 DEBUG nova.virt.disk.api [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Cannot resize image /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.309 187791 DEBUG nova.objects.instance [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lazy-loading 'migration_context' on Instance uuid 9deed673-fc96-4e81-b9ba-c3f0e83e1625 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.365 187791 DEBUG nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.366 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.367 187791 INFO nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Creating image(s)
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.367 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "/var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.368 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "/var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.368 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "/var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.380 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.381 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Ensure instance console log exists: /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.381 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.382 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.382 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.384 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.402 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.403 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Ensure instance console log exists: /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.404 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.405 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.405 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.437 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.439 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.440 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.462 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.483 187791 DEBUG nova.policy [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd1c8c9756a134cd7a38cb55743f12dad', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '73150461bb354f0fb8f4adf266d52ac8', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.517 187791 DEBUG nova.network.neutron [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Successfully created port: f25bb8cc-9d43-433e-9e69-63da3f5c18ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.530 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.531 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.576 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.577 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.578 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.655 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.657 187791 DEBUG nova.virt.disk.api [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Checking if we can resize image /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.657 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.740 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.742 187791 DEBUG nova.virt.disk.api [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Cannot resize image /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.743 187791 DEBUG nova.objects.instance [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lazy-loading 'migration_context' on Instance uuid 9f228c07-c6ac-479c-9edb-ceebc19eac87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.762 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.762 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Ensure instance console log exists: /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.763 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.763 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:27 compute-0 nova_compute[187787]: 2025-12-08 20:14:27.763 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.200 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.201 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.228 187791 DEBUG nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.314 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.314 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.320 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.321 187791 INFO nova.compute.claims [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Claim successful on node compute-0.ctlplane.example.com
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.521 187791 DEBUG nova.compute.provider_tree [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.551 187791 DEBUG nova.scheduler.client.report [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.587 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.273s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.588 187791 DEBUG nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.656 187791 DEBUG nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.657 187791 DEBUG nova.network.neutron [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.682 187791 INFO nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.718 187791 DEBUG nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.852 187791 DEBUG nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.853 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.853 187791 INFO nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Creating image(s)
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.854 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "/var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.854 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "/var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.855 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "/var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.866 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.920 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.921 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.921 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.932 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.953 187791 DEBUG nova.policy [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '182a1bf5731c443d90e215465b085637', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c09b5e1948374012b56a6b174f13203a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.991 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:28 compute-0 nova_compute[187787]: 2025-12-08 20:14:28.993 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.026 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.027 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.028 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.093 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.094 187791 DEBUG nova.virt.disk.api [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Checking if we can resize image /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.094 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.112 187791 DEBUG nova.network.neutron [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Successfully created port: af16ddfd-01f8-4225-96a8-8ec9a5aa19ba _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.156 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.156 187791 DEBUG nova.virt.disk.api [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Cannot resize image /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.157 187791 DEBUG nova.objects.instance [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lazy-loading 'migration_context' on Instance uuid 1e4936a4-4e9a-45e3-9bdb-bd423abc6045 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.176 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.177 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Ensure instance console log exists: /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.177 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.177 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.178 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.257 187791 DEBUG nova.network.neutron [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Successfully updated port: f25bb8cc-9d43-433e-9e69-63da3f5c18ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.287 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "refresh_cache-9deed673-fc96-4e81-b9ba-c3f0e83e1625" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.288 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquired lock "refresh_cache-9deed673-fc96-4e81-b9ba-c3f0e83e1625" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.288 187791 DEBUG nova.network.neutron [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:14:29 compute-0 podman[202017]: time="2025-12-08T20:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:14:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:14:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3427 "" "Go-http-client/1.1"
Dec 08 20:14:29 compute-0 nova_compute[187787]: 2025-12-08 20:14:29.850 187791 DEBUG nova.network.neutron [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 08 20:14:30 compute-0 nova_compute[187787]: 2025-12-08 20:14:30.304 187791 DEBUG nova.network.neutron [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Successfully updated port: 488a5725-c797-4165-b8ce-319c48f2e8b8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 08 20:14:30 compute-0 nova_compute[187787]: 2025-12-08 20:14:30.340 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:30 compute-0 nova_compute[187787]: 2025-12-08 20:14:30.340 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquired lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:30 compute-0 nova_compute[187787]: 2025-12-08 20:14:30.340 187791 DEBUG nova.network.neutron [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:14:30 compute-0 nova_compute[187787]: 2025-12-08 20:14:30.613 187791 DEBUG nova.network.neutron [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 08 20:14:31 compute-0 openstack_network_exporter[204149]: ERROR   20:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:14:31 compute-0 openstack_network_exporter[204149]: ERROR   20:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:14:31 compute-0 openstack_network_exporter[204149]: ERROR   20:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:14:31 compute-0 openstack_network_exporter[204149]: ERROR   20:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:14:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:14:31 compute-0 openstack_network_exporter[204149]: ERROR   20:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:14:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:14:31 compute-0 nova_compute[187787]: 2025-12-08 20:14:31.887 187791 DEBUG nova.compute.manager [req-6773e531-d3e7-47e9-bfca-cf7606ed5bdf req-ccf4ac70-cfd0-41fd-868e-ea6e47fa83f0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received event network-changed-f25bb8cc-9d43-433e-9e69-63da3f5c18ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:31 compute-0 nova_compute[187787]: 2025-12-08 20:14:31.888 187791 DEBUG nova.compute.manager [req-6773e531-d3e7-47e9-bfca-cf7606ed5bdf req-ccf4ac70-cfd0-41fd-868e-ea6e47fa83f0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Refreshing instance network info cache due to event network-changed-f25bb8cc-9d43-433e-9e69-63da3f5c18ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:14:31 compute-0 nova_compute[187787]: 2025-12-08 20:14:31.888 187791 DEBUG oslo_concurrency.lockutils [req-6773e531-d3e7-47e9-bfca-cf7606ed5bdf req-ccf4ac70-cfd0-41fd-868e-ea6e47fa83f0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-9deed673-fc96-4e81-b9ba-c3f0e83e1625" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.096 187791 DEBUG nova.network.neutron [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Successfully created port: 15884d80-a050-45e4-a91f-aa0953fde76b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.364 187791 DEBUG nova.network.neutron [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Updating instance_info_cache with network_info: [{"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.394 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Releasing lock "refresh_cache-9deed673-fc96-4e81-b9ba-c3f0e83e1625" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.394 187791 DEBUG nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Instance network_info: |[{"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.394 187791 DEBUG oslo_concurrency.lockutils [req-6773e531-d3e7-47e9-bfca-cf7606ed5bdf req-ccf4ac70-cfd0-41fd-868e-ea6e47fa83f0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-9deed673-fc96-4e81-b9ba-c3f0e83e1625" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.395 187791 DEBUG nova.network.neutron [req-6773e531-d3e7-47e9-bfca-cf7606ed5bdf req-ccf4ac70-cfd0-41fd-868e-ea6e47fa83f0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Refreshing network info cache for port f25bb8cc-9d43-433e-9e69-63da3f5c18ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.398 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Start _get_guest_xml network_info=[{"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.403 187791 WARNING nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.408 187791 DEBUG nova.virt.libvirt.host [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.408 187791 DEBUG nova.virt.libvirt.host [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.412 187791 DEBUG nova.virt.libvirt.host [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.413 187791 DEBUG nova.virt.libvirt.host [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.414 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.414 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-08T20:13:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2f15909f-e95c-4c15-b311-ac90858a554d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.414 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.414 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.415 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.415 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.415 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.415 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.415 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.416 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.416 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.416 187791 DEBUG nova.virt.hardware [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
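The topology run above (Flavor/Image limits and prefs of 0:0:0, meaning "unset", through "Sorted desired topologies") amounts to enumerating every sockets*cores*threads factorization of the flavor's vCPU count that fits the limits; with one vCPU only 1:1:1 survives. A rough illustrative stand-in for that enumeration, assuming a possible_topologies() helper that is not nova's real _get_possible_cpu_topologies():

# Illustrative sketch only: enumerate sockets*cores*threads factorizations of
# the vCPU count within the given per-dimension limits.
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    found = []
    for sockets in range(1, min(max_sockets, vcpus) + 1):
        for cores in range(1, min(max_cores, vcpus) + 1):
            for threads in range(1, min(max_threads, vcpus) + 1):
                if sockets * cores * threads == vcpus:
                    found.append((sockets, cores, threads))
    return found

print(possible_topologies(1))   # [(1, 1, 1)], matching the single topology logged above
print(possible_topologies(4))   # several candidates, e.g. (1, 1, 4), (1, 2, 2), (2, 2, 1), ...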
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.419 187791 DEBUG nova.privsep.utils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
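The supports_direct_io check logged above is essentially "open a test file with O_DIRECT and attempt one page-aligned write". A hypothetical, Linux-only sketch of such a probe (the real helper lives in nova/privsep/utils.py and handles more error cases):

# Hypothetical O_DIRECT probe sketch; returns False if the filesystem under
# dirpath rejects direct I/O (typically EINVAL on the open or write).
import mmap
import os

def probe_direct_io(dirpath):
    testfile = os.path.join(dirpath, ".directio.test")
    fd = None
    try:
        fd = os.open(testfile, os.O_CREAT | os.O_WRONLY | os.O_DIRECT)
        buf = mmap.mmap(-1, 4096)        # page-aligned zero buffer; O_DIRECT requires alignment
        os.write(fd, buf)
        return True
    except OSError:
        return False
    finally:
        if fd is not None:
            os.close(fd)
        if os.path.exists(testfile):
            os.unlink(testfile)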
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.420 187791 DEBUG nova.virt.libvirt.vif [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:14:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-853585639',display_name='tempest-ServersTestJSON-server-853585639',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-853585639',id=2,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCaUFVSmdjdFM6Nr7JK2ENZ5QSAJNbmdFQI+NWSK4PwXWFNq/zuvL0KeEeiTQrSzUIFjT/xX/I392m1qXIMa6vFo4a9mq+fgSkfVNn+Pzisv0GZRCLUidzu4kPnQkU1Zyg==',key_name='tempest-keypair-321994246',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71532308007a48d5aef697fbd39501f6',ramdisk_id='',reservation_id='r-s3d9t0l6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-27859848',owner_user_name='tempest-ServersTestJSON-27859848-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:14:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5c734374332f4a18956eedb746b128bf',uuid=9deed673-fc96-4e81-b9ba-c3f0e83e1625,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.421 187791 DEBUG nova.network.os_vif_util [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Converting VIF {"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.421 187791 DEBUG nova.network.os_vif_util [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:36,bridge_name='br-int',has_traffic_filtering=True,id=f25bb8cc-9d43-433e-9e69-63da3f5c18ad,network=Network(9b5e5772-d3b1-48b1-8423-229c5601e5b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25bb8cc-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.423 187791 DEBUG nova.objects.instance [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9deed673-fc96-4e81-b9ba-c3f0e83e1625 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.448 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] End _get_guest_xml xml=<domain type="kvm">
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <uuid>9deed673-fc96-4e81-b9ba-c3f0e83e1625</uuid>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <name>instance-00000002</name>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <memory>131072</memory>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <vcpu>1</vcpu>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <metadata>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:name>tempest-ServersTestJSON-server-853585639</nova:name>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:creationTime>2025-12-08 20:14:32</nova:creationTime>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:flavor name="m1.nano">
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:memory>128</nova:memory>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:disk>1</nova:disk>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:swap>0</nova:swap>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:ephemeral>0</nova:ephemeral>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:vcpus>1</nova:vcpus>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       </nova:flavor>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:owner>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:user uuid="5c734374332f4a18956eedb746b128bf">tempest-ServersTestJSON-27859848-project-member</nova:user>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:project uuid="71532308007a48d5aef697fbd39501f6">tempest-ServersTestJSON-27859848</nova:project>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       </nova:owner>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:root type="image" uuid="ffae60d8-1843-4b3a-9d11-b077095cedb9"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:ports>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:port uuid="f25bb8cc-9d43-433e-9e69-63da3f5c18ad">
Dec 08 20:14:32 compute-0 nova_compute[187787]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         </nova:port>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       </nova:ports>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </nova:instance>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </metadata>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <sysinfo type="smbios">
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <system>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="manufacturer">RDO</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="product">OpenStack Compute</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="serial">9deed673-fc96-4e81-b9ba-c3f0e83e1625</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="uuid">9deed673-fc96-4e81-b9ba-c3f0e83e1625</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="family">Virtual Machine</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </system>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </sysinfo>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <os>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <boot dev="hd"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <smbios mode="sysinfo"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </os>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <features>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <acpi/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <apic/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <vmcoreinfo/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </features>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <clock offset="utc">
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <timer name="pit" tickpolicy="delay"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <timer name="hpet" present="no"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </clock>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <cpu mode="host-model" match="exact">
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <topology sockets="1" cores="1" threads="1"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <disk type="file" device="disk">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <target dev="vda" bus="virtio"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <disk type="file" device="cdrom">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <driver name="qemu" type="raw" cache="none"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk.config"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <target dev="sda" bus="sata"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <interface type="ethernet">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <mac address="fa:16:3e:02:16:36"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <driver name="vhost" rx_queue_size="512"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <mtu size="1442"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <target dev="tapf25bb8cc-9d"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <serial type="pty">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <log file="/var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/console.log" append="off"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </serial>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <video>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </video>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <input type="tablet" bus="usb"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <rng model="virtio">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <backend model="random">/dev/urandom</backend>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="usb" index="0"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <memballoon model="virtio">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <stats period="10"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </memballoon>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:14:32 compute-0 nova_compute[187787]: </domain>
Dec 08 20:14:32 compute-0 nova_compute[187787]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
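The <domain> document dumped above is what nova hands to libvirt for instance-00000002. When reading such dumps out of logs, a stdlib-only helper can pull out the interesting fields; summarize_domain below is a hypothetical convenience for log analysis, not part of nova:

# Stdlib-only sketch: summarize a logged <domain> XML string.
import xml.etree.ElementTree as ET

def summarize_domain(domain_xml):
    root = ET.fromstring(domain_xml)
    return {
        "name": root.findtext("name"),                      # e.g. instance-00000002
        "uuid": root.findtext("uuid"),
        "memory_kib": int(root.findtext("memory")),         # 131072 KiB == 128 MiB (m1.nano)
        "vcpus": int(root.findtext("vcpu")),
        "machine": root.find("os/type").get("machine"),     # q35, from image_hw_machine_type
        "disks": [d.find("target").get("dev") for d in root.findall("devices/disk")],
        "macs": [i.find("mac").get("address") for i in root.findall("devices/interface")],
    }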
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.449 187791 DEBUG nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Preparing to wait for external event network-vif-plugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.449 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.449 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.450 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
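The prepare_for_instance_event line and the "<uuid>-events" lock acquire/release above register a waiter for network-vif-plugged-f25bb8cc-... before the VIF is actually plugged, so the later neutron callback cannot be missed. A hedged sketch of that register-then-signal pattern (greatly simplified; the real InstanceEvents class in nova.compute.manager is considerably richer):

# Simplified register-then-signal pattern, for illustration only.
import threading

_events = {}
_events_lock = threading.Lock()

def prepare_for_instance_event(instance_uuid, name):
    with _events_lock:                          # analogous to the "<uuid>-events" lock in the log
        return _events.setdefault((instance_uuid, name), threading.Event())

def external_instance_event(instance_uuid, name):
    with _events_lock:
        event = _events.pop((instance_uuid, name), None)
    if event is not None:
        event.set()                             # wakes whoever blocked on event.wait()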
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.450 187791 DEBUG nova.virt.libvirt.vif [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:14:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestJSON-server-853585639',display_name='tempest-ServersTestJSON-server-853585639',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-853585639',id=2,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCaUFVSmdjdFM6Nr7JK2ENZ5QSAJNbmdFQI+NWSK4PwXWFNq/zuvL0KeEeiTQrSzUIFjT/xX/I392m1qXIMa6vFo4a9mq+fgSkfVNn+Pzisv0GZRCLUidzu4kPnQkU1Zyg==',key_name='tempest-keypair-321994246',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='71532308007a48d5aef697fbd39501f6',ramdisk_id='',reservation_id='r-s3d9t0l6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestJSON-27859848',owner_user_name='tempest-ServersTestJSON-27859848-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:14:24Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5c734374332f4a18956eedb746b128bf',uuid=9deed673-fc96-4e81-b9ba-c3f0e83e1625,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.451 187791 DEBUG nova.network.os_vif_util [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Converting VIF {"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.451 187791 DEBUG nova.network.os_vif_util [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:36,bridge_name='br-int',has_traffic_filtering=True,id=f25bb8cc-9d43-433e-9e69-63da3f5c18ad,network=Network(9b5e5772-d3b1-48b1-8423-229c5601e5b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25bb8cc-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.452 187791 DEBUG os_vif [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:36,bridge_name='br-int',has_traffic_filtering=True,id=f25bb8cc-9d43-433e-9e69-63da3f5c18ad,network=Network(9b5e5772-d3b1-48b1-8423-229c5601e5b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25bb8cc-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
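The Converting/Converted pair above, followed by the os_vif "Plugging vif" call, shows nova's network-info dict being mapped onto an os-vif object before the plug. A hypothetical, heavily simplified illustration of that mapping (the genuine VIFOpenVSwitch comes from os-vif's object model and carries many more fields):

# Hypothetical mini-model of the VIF dict -> os-vif object conversion above.
from dataclasses import dataclass

@dataclass
class MiniVIFOpenVSwitch:
    id: str
    address: str
    bridge_name: str
    vif_name: str
    has_traffic_filtering: bool
    active: bool

def nova_vif_to_mini_ovs(vif):
    return MiniVIFOpenVSwitch(
        id=vif["id"],
        address=vif["address"],
        bridge_name=vif["details"]["bridge_name"],
        vif_name="tap" + vif["id"][:11],        # matches devname "tapf25bb8cc-9d" in the log
        has_traffic_filtering=vif["details"]["port_filter"],
        active=vif["active"],
    )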
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.493 187791 DEBUG ovsdbapp.backend.ovs_idl [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.493 187791 DEBUG ovsdbapp.backend.ovs_idl [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.494 187791 DEBUG ovsdbapp.backend.ovs_idl [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.494 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.495 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [POLLOUT] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.496 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.497 187791 DEBUG nova.network.neutron [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Successfully updated port: af16ddfd-01f8-4225-96a8-8ec9a5aa19ba _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.499 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.500 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.503 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.515 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.516 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.516 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
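The AddBridgeCommand transaction above (name=br-int, may_exist=True, datapath_type=system) is committed over OVSDB at tcp:127.0.0.1:6640, and "caused no change" because br-int already exists. As a sketch only, a roughly equivalent operation expressed through the ovs-vsctl CLI from Python:

# Sketch: roughly what the logged OVSDB transaction ensures, done via ovs-vsctl.
import subprocess

subprocess.run(
    ["ovs-vsctl", "--may-exist", "add-br", "br-int",
     "--", "set", "Bridge", "br-int", "datapath_type=system"],
    check=True,
)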
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.517 187791 INFO oslo.privsep.daemon [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpy2_t8sz4/privsep.sock']
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.595 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.596 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquired lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.596 187791 DEBUG nova.network.neutron [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.865 187791 DEBUG nova.network.neutron [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Updating instance_info_cache with network_info: [{"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.889 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Releasing lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.890 187791 DEBUG nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Instance network_info: |[{"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.892 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Start _get_guest_xml network_info=[{"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.897 187791 WARNING nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.904 187791 DEBUG nova.virt.libvirt.host [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.905 187791 DEBUG nova.virt.libvirt.host [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.908 187791 DEBUG nova.compute.manager [req-ddb5e470-83e9-4f90-8f05-8f501a283d95 req-8aff7210-d78a-4151-b46a-9424904d3c89 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received event network-changed-488a5725-c797-4165-b8ce-319c48f2e8b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.908 187791 DEBUG nova.compute.manager [req-ddb5e470-83e9-4f90-8f05-8f501a283d95 req-8aff7210-d78a-4151-b46a-9424904d3c89 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Refreshing instance network info cache due to event network-changed-488a5725-c797-4165-b8ce-319c48f2e8b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.909 187791 DEBUG oslo_concurrency.lockutils [req-ddb5e470-83e9-4f90-8f05-8f501a283d95 req-8aff7210-d78a-4151-b46a-9424904d3c89 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.909 187791 DEBUG oslo_concurrency.lockutils [req-ddb5e470-83e9-4f90-8f05-8f501a283d95 req-8aff7210-d78a-4151-b46a-9424904d3c89 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.909 187791 DEBUG nova.network.neutron [req-ddb5e470-83e9-4f90-8f05-8f501a283d95 req-8aff7210-d78a-4151-b46a-9424904d3c89 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Refreshing network info cache for port 488a5725-c797-4165-b8ce-319c48f2e8b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.914 187791 DEBUG nova.virt.libvirt.host [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.915 187791 DEBUG nova.virt.libvirt.host [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
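The two host.py probes above first find no CPU controller under cgroups v1 and then find one under cgroups v2, so CPU tuning for guests on this host will go through the unified hierarchy. On a cgroup v2 host that check reduces to reading one file; a minimal sketch:

# Sketch of a cgroup v2 CPU-controller check: available controllers are
# listed space-separated in /sys/fs/cgroup/cgroup.controllers.
from pathlib import Path

def has_cgroupsv2_cpu_controller():
    controllers = Path("/sys/fs/cgroup/cgroup.controllers")
    return controllers.exists() and "cpu" in controllers.read_text().split()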
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.915 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.915 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-08T20:13:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2f15909f-e95c-4c15-b311-ac90858a554d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.916 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.916 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.916 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.916 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.917 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.917 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.917 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.917 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.917 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.918 187791 DEBUG nova.virt.hardware [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.921 187791 DEBUG nova.virt.libvirt.vif [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:14:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-662159316',display_name='tempest-ServerActionsTestJSON-server-662159316',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-662159316',id=1,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0lmx240Myi6uyvHJAjl6OdYHzSJho9DIqF0f1bqWW8lbJ2EieN8cF8oR4Ivs97IM8rHwT/JRYR62Lhhu60wGctMY+Pf4FN5Y7bGT8qLOtA+UCE3QK9D+M+fl1vEmqInA==',key_name='tempest-keypair-940238471',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ad1a4d6aebb84f6fb894551cd68d2ae1',ramdisk_id='',reservation_id='r-02hyot02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1895647313',owner_user_name='tempest-ServerActionsTestJSON-1895647313-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:14:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6e0c4248254a4bcb850e5443f0b8ad8b',uuid=31b38d28-b90e-434c-9967-912987aee08b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": 
null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.921 187791 DEBUG nova.network.os_vif_util [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converting VIF {"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.922 187791 DEBUG nova.network.os_vif_util [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.923 187791 DEBUG nova.objects.instance [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 31b38d28-b90e-434c-9967-912987aee08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.942 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] End _get_guest_xml xml=<domain type="kvm">
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <uuid>31b38d28-b90e-434c-9967-912987aee08b</uuid>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <name>instance-00000001</name>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <memory>131072</memory>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <vcpu>1</vcpu>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <metadata>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:name>tempest-ServerActionsTestJSON-server-662159316</nova:name>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:creationTime>2025-12-08 20:14:32</nova:creationTime>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:flavor name="m1.nano">
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:memory>128</nova:memory>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:disk>1</nova:disk>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:swap>0</nova:swap>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:ephemeral>0</nova:ephemeral>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:vcpus>1</nova:vcpus>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       </nova:flavor>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:owner>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:user uuid="6e0c4248254a4bcb850e5443f0b8ad8b">tempest-ServerActionsTestJSON-1895647313-project-member</nova:user>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:project uuid="ad1a4d6aebb84f6fb894551cd68d2ae1">tempest-ServerActionsTestJSON-1895647313</nova:project>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       </nova:owner>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:root type="image" uuid="ffae60d8-1843-4b3a-9d11-b077095cedb9"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <nova:ports>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         <nova:port uuid="488a5725-c797-4165-b8ce-319c48f2e8b8">
Dec 08 20:14:32 compute-0 nova_compute[187787]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:         </nova:port>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       </nova:ports>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </nova:instance>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </metadata>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <sysinfo type="smbios">
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <system>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="manufacturer">RDO</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="product">OpenStack Compute</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="serial">31b38d28-b90e-434c-9967-912987aee08b</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="uuid">31b38d28-b90e-434c-9967-912987aee08b</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <entry name="family">Virtual Machine</entry>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </system>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </sysinfo>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <os>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <boot dev="hd"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <smbios mode="sysinfo"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </os>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <features>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <acpi/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <apic/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <vmcoreinfo/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </features>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <clock offset="utc">
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <timer name="pit" tickpolicy="delay"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <timer name="hpet" present="no"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </clock>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <cpu mode="host-model" match="exact">
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <topology sockets="1" cores="1" threads="1"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <disk type="file" device="disk">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <target dev="vda" bus="virtio"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <disk type="file" device="cdrom">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <driver name="qemu" type="raw" cache="none"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.config"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <target dev="sda" bus="sata"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <interface type="ethernet">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <mac address="fa:16:3e:6c:cd:5c"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <driver name="vhost" rx_queue_size="512"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <mtu size="1442"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <target dev="tap488a5725-c7"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <serial type="pty">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <log file="/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/console.log" append="off"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </serial>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <video>
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </video>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <input type="tablet" bus="usb"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <rng model="virtio">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <backend model="random">/dev/urandom</backend>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <controller type="usb" index="0"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     <memballoon model="virtio">
Dec 08 20:14:32 compute-0 nova_compute[187787]:       <stats period="10"/>
Dec 08 20:14:32 compute-0 nova_compute[187787]:     </memballoon>
Dec 08 20:14:32 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:14:32 compute-0 nova_compute[187787]: </domain>
Dec 08 20:14:32 compute-0 nova_compute[187787]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.944 187791 DEBUG nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Preparing to wait for external event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.944 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.944 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.945 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.945 187791 DEBUG nova.virt.libvirt.vif [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:14:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-662159316',display_name='tempest-ServerActionsTestJSON-server-662159316',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-662159316',id=1,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0lmx240Myi6uyvHJAjl6OdYHzSJho9DIqF0f1bqWW8lbJ2EieN8cF8oR4Ivs97IM8rHwT/JRYR62Lhhu60wGctMY+Pf4FN5Y7bGT8qLOtA+UCE3QK9D+M+fl1vEmqInA==',key_name='tempest-keypair-940238471',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ad1a4d6aebb84f6fb894551cd68d2ae1',ramdisk_id='',reservation_id='r-02hyot02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1895647313',owner_user_name='tempest-ServerActionsTestJSON-1895647313-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:14:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6e0c4248254a4bcb850e5443f0b8ad8b',uuid=31b38d28-b90e-434c-9967-912987aee08b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.945 187791 DEBUG nova.network.os_vif_util [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converting VIF {"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.946 187791 DEBUG nova.network.os_vif_util [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.947 187791 DEBUG os_vif [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.947 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.948 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:32 compute-0 nova_compute[187787]: 2025-12-08 20:14:32.948 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.071 187791 DEBUG nova.network.neutron [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.196 187791 INFO oslo.privsep.daemon [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Spawned new privsep daemon via rootwrap
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.066 214492 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.072 214492 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.075 214492 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.075 214492 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214492
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.199 187791 WARNING oslo_privsep.priv_context [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] privsep daemon already running
Dec 08 20:14:33 compute-0 podman[214497]: 2025-12-08 20:14:33.48451868 +0000 UTC m=+0.053862004 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.554 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.555 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf25bb8cc-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.555 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf25bb8cc-9d, col_values=(('external_ids', {'iface-id': 'f25bb8cc-9d43-433e-9e69-63da3f5c18ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:16:36', 'vm-uuid': '9deed673-fc96-4e81-b9ba-c3f0e83e1625'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.569 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:33 compute-0 NetworkManager[56229]: <info>  [1765224873.5706] manager: (tapf25bb8cc-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.573 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.576 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.578 187791 INFO os_vif [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:36,bridge_name='br-int',has_traffic_filtering=True,id=f25bb8cc-9d43-433e-9e69-63da3f5c18ad,network=Network(9b5e5772-d3b1-48b1-8423-229c5601e5b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25bb8cc-9d')
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.579 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.579 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap488a5725-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.580 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap488a5725-c7, col_values=(('external_ids', {'iface-id': '488a5725-c797-4165-b8ce-319c48f2e8b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:cd:5c', 'vm-uuid': '31b38d28-b90e-434c-9967-912987aee08b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.581 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:33 compute-0 NetworkManager[56229]: <info>  [1765224873.5822] manager: (tap488a5725-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.584 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.588 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.589 187791 INFO os_vif [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7')
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.727 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.728 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.728 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] No VIF found with MAC fa:16:3e:6c:cd:5c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.728 187791 INFO nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Using config drive
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.837 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.838 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.838 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] No VIF found with MAC fa:16:3e:02:16:36, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 08 20:14:33 compute-0 nova_compute[187787]: 2025-12-08 20:14:33.839 187791 INFO nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Using config drive
Dec 08 20:14:34 compute-0 nova_compute[187787]: 2025-12-08 20:14:34.943 187791 DEBUG nova.network.neutron [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Successfully updated port: 15884d80-a050-45e4-a91f-aa0953fde76b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 08 20:14:34 compute-0 nova_compute[187787]: 2025-12-08 20:14:34.957 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "refresh_cache-1e4936a4-4e9a-45e3-9bdb-bd423abc6045" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:34 compute-0 nova_compute[187787]: 2025-12-08 20:14:34.958 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquired lock "refresh_cache-1e4936a4-4e9a-45e3-9bdb-bd423abc6045" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:34 compute-0 nova_compute[187787]: 2025-12-08 20:14:34.958 187791 DEBUG nova.network.neutron [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.254 187791 DEBUG nova.network.neutron [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.275 187791 DEBUG nova.network.neutron [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updating instance_info_cache with network_info: [{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.501 187791 DEBUG nova.network.neutron [req-6773e531-d3e7-47e9-bfca-cf7606ed5bdf req-ccf4ac70-cfd0-41fd-868e-ea6e47fa83f0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Updated VIF entry in instance network info cache for port f25bb8cc-9d43-433e-9e69-63da3f5c18ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.501 187791 DEBUG nova.network.neutron [req-6773e531-d3e7-47e9-bfca-cf7606ed5bdf req-ccf4ac70-cfd0-41fd-868e-ea6e47fa83f0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Updating instance_info_cache with network_info: [{"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.568 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Releasing lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.568 187791 DEBUG nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Instance network_info: |[{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.568 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.571 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Start _get_guest_xml network_info=[{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.576 187791 WARNING nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.582 187791 DEBUG oslo_concurrency.lockutils [req-6773e531-d3e7-47e9-bfca-cf7606ed5bdf req-ccf4ac70-cfd0-41fd-868e-ea6e47fa83f0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-9deed673-fc96-4e81-b9ba-c3f0e83e1625" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.583 187791 DEBUG nova.virt.libvirt.host [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.583 187791 DEBUG nova.virt.libvirt.host [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.589 187791 DEBUG nova.virt.libvirt.host [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.589 187791 DEBUG nova.virt.libvirt.host [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.590 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.590 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-08T20:13:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2f15909f-e95c-4c15-b311-ac90858a554d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.591 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.591 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.591 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.592 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.592 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.592 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.592 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.593 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.593 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.594 187791 DEBUG nova.virt.hardware [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.597 187791 DEBUG nova.virt.libvirt.vif [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:14:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-1776949335',display_name='tempest-AttachInterfacesUnderV243Test-server-1776949335',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-1776949335',id=3,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHUXKCLdMozg5QacQQMDuqiDmsucELgm2tHYaW7qA0wQwZkWq396yhR854vKmw7vKOgqLiXWTJUWaZ6YubHL3+CGLnFRSVD05P9VGevvVx9c62su/GSL+XmKXtiO6ODbEg==',key_name='tempest-keypair-1472805335',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='73150461bb354f0fb8f4adf266d52ac8',ramdisk_id='',reservation_id='r-rkf8jk67',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1745682393',owner_user_name='tempest-AttachInterfacesUnderV243Test-1745682393-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:14:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d1c8c9756a134cd7a38cb55743f12dad',uuid=9f228c07-c6ac-479c-9edb-ceebc19eac87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.598 187791 DEBUG nova.network.os_vif_util [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Converting VIF {"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.599 187791 DEBUG nova.network.os_vif_util [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:0b:70,bridge_name='br-int',has_traffic_filtering=True,id=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba,network=Network(66073b60-2cee-4d92-b656-15d29787b3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf16ddfd-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.600 187791 DEBUG nova.objects.instance [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9f228c07-c6ac-479c-9edb-ceebc19eac87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.617 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] End _get_guest_xml xml=<domain type="kvm">
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <uuid>9f228c07-c6ac-479c-9edb-ceebc19eac87</uuid>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <name>instance-00000003</name>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <memory>131072</memory>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <vcpu>1</vcpu>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <metadata>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <nova:name>tempest-AttachInterfacesUnderV243Test-server-1776949335</nova:name>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <nova:creationTime>2025-12-08 20:14:35</nova:creationTime>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <nova:flavor name="m1.nano">
Dec 08 20:14:35 compute-0 nova_compute[187787]:         <nova:memory>128</nova:memory>
Dec 08 20:14:35 compute-0 nova_compute[187787]:         <nova:disk>1</nova:disk>
Dec 08 20:14:35 compute-0 nova_compute[187787]:         <nova:swap>0</nova:swap>
Dec 08 20:14:35 compute-0 nova_compute[187787]:         <nova:ephemeral>0</nova:ephemeral>
Dec 08 20:14:35 compute-0 nova_compute[187787]:         <nova:vcpus>1</nova:vcpus>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       </nova:flavor>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <nova:owner>
Dec 08 20:14:35 compute-0 nova_compute[187787]:         <nova:user uuid="d1c8c9756a134cd7a38cb55743f12dad">tempest-AttachInterfacesUnderV243Test-1745682393-project-member</nova:user>
Dec 08 20:14:35 compute-0 nova_compute[187787]:         <nova:project uuid="73150461bb354f0fb8f4adf266d52ac8">tempest-AttachInterfacesUnderV243Test-1745682393</nova:project>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       </nova:owner>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <nova:root type="image" uuid="ffae60d8-1843-4b3a-9d11-b077095cedb9"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <nova:ports>
Dec 08 20:14:35 compute-0 nova_compute[187787]:         <nova:port uuid="af16ddfd-01f8-4225-96a8-8ec9a5aa19ba">
Dec 08 20:14:35 compute-0 nova_compute[187787]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:         </nova:port>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       </nova:ports>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     </nova:instance>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   </metadata>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <sysinfo type="smbios">
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <system>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <entry name="manufacturer">RDO</entry>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <entry name="product">OpenStack Compute</entry>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <entry name="serial">9f228c07-c6ac-479c-9edb-ceebc19eac87</entry>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <entry name="uuid">9f228c07-c6ac-479c-9edb-ceebc19eac87</entry>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <entry name="family">Virtual Machine</entry>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     </system>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   </sysinfo>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <os>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <boot dev="hd"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <smbios mode="sysinfo"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   </os>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <features>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <acpi/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <apic/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <vmcoreinfo/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   </features>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <clock offset="utc">
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <timer name="pit" tickpolicy="delay"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <timer name="hpet" present="no"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   </clock>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <cpu mode="host-model" match="exact">
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <topology sockets="1" cores="1" threads="1"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <disk type="file" device="disk">
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <target dev="vda" bus="virtio"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <disk type="file" device="cdrom">
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <driver name="qemu" type="raw" cache="none"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.config"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <target dev="sda" bus="sata"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <interface type="ethernet">
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <mac address="fa:16:3e:d3:0b:70"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <driver name="vhost" rx_queue_size="512"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <mtu size="1442"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <target dev="tapaf16ddfd-01"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <serial type="pty">
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <log file="/var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/console.log" append="off"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     </serial>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <video>
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     </video>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <input type="tablet" bus="usb"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <rng model="virtio">
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <backend model="random">/dev/urandom</backend>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <controller type="usb" index="0"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     <memballoon model="virtio">
Dec 08 20:14:35 compute-0 nova_compute[187787]:       <stats period="10"/>
Dec 08 20:14:35 compute-0 nova_compute[187787]:     </memballoon>
Dec 08 20:14:35 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:14:35 compute-0 nova_compute[187787]: </domain>
Dec 08 20:14:35 compute-0 nova_compute[187787]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
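The guest XML dumped above embeds Nova's own bookkeeping in the metadata element under the http://openstack.org/xmlns/libvirt/nova/1.1 namespace, alongside the libvirt-visible uuid, memory and vcpu settings. A minimal sketch, using only Python's standard xml.etree.ElementTree, of pulling that summary back out of such a dump; the summarize_domain helper is illustrative, not part of Nova:

    # Illustrative only: extract the Nova metadata embedded in a guest XML
    # dump like the one logged above. Standard library only.
    import xml.etree.ElementTree as ET

    NOVA_NS = {"nova": "http://openstack.org/xmlns/libvirt/nova/1.1"}

    def summarize_domain(xml_text: str) -> dict:
        root = ET.fromstring(xml_text)
        meta = root.find("./metadata/nova:instance", NOVA_NS)
        flavor = meta.find("nova:flavor", NOVA_NS)
        port = meta.find("./nova:ports/nova:port", NOVA_NS)
        ip = port.find("nova:ip", NOVA_NS)
        return {
            "uuid": root.findtext("uuid"),
            "display_name": meta.findtext("nova:name", namespaces=NOVA_NS),
            "flavor": flavor.get("name"),
            "memory_mb": int(flavor.findtext("nova:memory", namespaces=NOVA_NS)),
            "vcpus": int(flavor.findtext("nova:vcpus", namespaces=NOVA_NS)),
            "fixed_ip": ip.get("address"),
        }

    # summarize_domain(xml_text) for the dump above would yield
    # {'uuid': '9f228c07-c6ac-479c-9edb-ceebc19eac87', 'flavor': 'm1.nano',
    #  'memory_mb': 128, 'vcpus': 1, 'fixed_ip': '10.100.0.10', ...}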
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.619 187791 DEBUG nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Preparing to wait for external event network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.619 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.619 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.619 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
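The three lockutils messages above show Nova registering an event waiter for network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba before it actually plugs the VIF, so the notification Neutron sends later cannot be lost to a race. A conceptual sketch of that register-then-wait pattern with a plain threading.Event; the names here are invented and Nova's real implementation is nova.compute.manager.InstanceEvents:

    # Conceptual sketch only: register the waiter before starting the
    # operation, then block until the external event arrives or times out.
    import threading

    events = {}                 # (instance_uuid, event_name) -> threading.Event
    events_lock = threading.Lock()

    def prepare_for_event(instance_uuid, name):
        with events_lock:       # mirrors the per-instance "-events" lock above
            return events.setdefault((instance_uuid, name), threading.Event())

    def deliver_event(instance_uuid, name):
        with events_lock:
            ev = events.get((instance_uuid, name))
        if ev:
            ev.set()

    waiter = prepare_for_event("9f228c07-c6ac-479c-9edb-ceebc19eac87",
                               "network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba")
    # ... plug the VIF, define and start the domain ...
    if not waiter.wait(timeout=300):
        raise TimeoutError("network-vif-plugged never arrived")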
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.620 187791 DEBUG nova.virt.libvirt.vif [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:14:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-1776949335',display_name='tempest-AttachInterfacesUnderV243Test-server-1776949335',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-1776949335',id=3,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHUXKCLdMozg5QacQQMDuqiDmsucELgm2tHYaW7qA0wQwZkWq396yhR854vKmw7vKOgqLiXWTJUWaZ6YubHL3+CGLnFRSVD05P9VGevvVx9c62su/GSL+XmKXtiO6ODbEg==',key_name='tempest-keypair-1472805335',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='73150461bb354f0fb8f4adf266d52ac8',ramdisk_id='',reservation_id='r-rkf8jk67',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesUnderV243Test-1745682393',owner_user_name='tempest-AttachInterfacesUnderV243Test-1745682393-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:14:27Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d1c8c9756a134cd7a38cb55743f12dad',uuid=9f228c07-c6ac-479c-9edb-ceebc19eac87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.621 187791 DEBUG nova.network.os_vif_util [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Converting VIF {"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.621 187791 DEBUG nova.network.os_vif_util [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d3:0b:70,bridge_name='br-int',has_traffic_filtering=True,id=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba,network=Network(66073b60-2cee-4d92-b656-15d29787b3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf16ddfd-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.622 187791 DEBUG os_vif [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:0b:70,bridge_name='br-int',has_traffic_filtering=True,id=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba,network=Network(66073b60-2cee-4d92-b656-15d29787b3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf16ddfd-01') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.622 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.623 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.623 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.626 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.627 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf16ddfd-01, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.627 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf16ddfd-01, col_values=(('external_ids', {'iface-id': 'af16ddfd-01f8-4225-96a8-8ec9a5aa19ba', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d3:0b:70', 'vm-uuid': '9f228c07-c6ac-479c-9edb-ceebc19eac87'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
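The two ovsdbapp commands above are essentially the whole of the os-vif OVS plug: add the tap port to br-int (idempotently, may_exist=True) and set the external_ids that let ovn-controller recognise and claim the interface. A rough equivalent using the ovs-vsctl CLI from Python, with the values taken from the transaction above; this is a standalone illustration, not Nova code:

    # Illustrative only: the same effect as the AddPortCommand/DbSetCommand
    # transaction above, expressed as a single ovs-vsctl invocation.
    import subprocess

    def plug_ovs_port(bridge, dev, iface_id, mac, vm_uuid):
        subprocess.run(
            ["ovs-vsctl", "--may-exist", "add-port", bridge, dev,
             "--", "set", "Interface", dev,
             f"external_ids:iface-id={iface_id}",
             "external_ids:iface-status=active",
             f"external_ids:attached-mac={mac}",
             f"external_ids:vm-uuid={vm_uuid}"],
            check=True)

    plug_ovs_port("br-int", "tapaf16ddfd-01",
                  "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba",
                  "fa:16:3e:d3:0b:70",
                  "9f228c07-c6ac-479c-9edb-ceebc19eac87")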
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.628 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:35 compute-0 NetworkManager[56229]: <info>  [1765224875.6292] manager: (tapaf16ddfd-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.631 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.643 187791 INFO nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Creating config drive at /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.config
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.648 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx2jw5r98 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.666 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.668 187791 INFO os_vif [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d3:0b:70,bridge_name='br-int',has_traffic_filtering=True,id=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba,network=Network(66073b60-2cee-4d92-b656-15d29787b3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf16ddfd-01')
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.742 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.742 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.743 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] No VIF found with MAC fa:16:3e:d3:0b:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.743 187791 INFO nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Using config drive
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.768 187791 INFO nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Creating config drive at /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk.config
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.774 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfav9akt3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.789 187791 DEBUG oslo_concurrency.processutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx2jw5r98" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
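Each config drive created above is just an ISO 9660/Joliet/Rock Ridge image labelled config-2, built with mkisofs from a temporary directory of pre-rendered metadata files. A small wrapper sketch that runs the same command line from Python; the flags are copied from the commands logged above and the function name is invented:

    # Illustrative only: build a config-2 ISO the way the commands above do,
    # from a directory of already-rendered metadata files.
    import subprocess

    def build_config_drive(output_path: str, source_dir: str) -> None:
        subprocess.run(
            ["/usr/bin/mkisofs", "-o", output_path,
             "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
             "-publisher", "OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9",
             "-quiet", "-J", "-r", "-V", "config-2", source_dir],
            check=True)

    # e.g. build_config_drive("/var/lib/nova/instances/<uuid>/disk.config",
    #                         "/tmp/<rendered-metadata-dir>")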
Dec 08 20:14:35 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 08 20:14:35 compute-0 NetworkManager[56229]: <info>  [1765224875.8850] manager: (tap488a5725-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Dec 08 20:14:35 compute-0 kernel: tap488a5725-c7: entered promiscuous mode
Dec 08 20:14:35 compute-0 ovn_controller[96170]: 2025-12-08T20:14:35Z|00027|binding|INFO|Claiming lport 488a5725-c797-4165-b8ce-319c48f2e8b8 for this chassis.
Dec 08 20:14:35 compute-0 ovn_controller[96170]: 2025-12-08T20:14:35Z|00028|binding|INFO|488a5725-c797-4165-b8ce-319c48f2e8b8: Claiming fa:16:3e:6c:cd:5c 10.100.0.4
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.896 187791 DEBUG nova.network.neutron [req-ddb5e470-83e9-4f90-8f05-8f501a283d95 req-8aff7210-d78a-4151-b46a-9424904d3c89 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Updated VIF entry in instance network info cache for port 488a5725-c797-4165-b8ce-319c48f2e8b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.897 187791 DEBUG nova.network.neutron [req-ddb5e470-83e9-4f90-8f05-8f501a283d95 req-8aff7210-d78a-4151-b46a-9424904d3c89 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Updating instance_info_cache with network_info: [{"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.899 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.901 187791 DEBUG oslo_concurrency.processutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfav9akt3" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:35 compute-0 podman[214536]: 2025-12-08 20:14:35.926921144 +0000 UTC m=+0.066816646 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:14:35 compute-0 podman[214537]: 2025-12-08 20:14:35.934916953 +0000 UTC m=+0.071341578 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.935 187791 DEBUG oslo_concurrency.lockutils [req-ddb5e470-83e9-4f90-8f05-8f501a283d95 req-8aff7210-d78a-4151-b46a-9424904d3c89 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:35 compute-0 systemd-udevd[214594]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:14:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:35.951 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:cd:5c 10.100.0.4'], port_security=['fa:16:3e:6c:cd:5c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '31b38d28-b90e-434c-9967-912987aee08b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad1a4d6aebb84f6fb894551cd68d2ae1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7d525409-d812-4dc1-bb50-a782007ffe4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bcdae06f-5d1b-4090-b312-569e33431ebf, chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=488a5725-c797-4165-b8ce-319c48f2e8b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:14:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:35.952 105024 INFO neutron.agent.ovn.metadata.agent [-] Port 488a5725-c797-4165-b8ce-319c48f2e8b8 in datapath f378b9ae-fe6a-498a-b0ea-0d98aea69001 bound to our chassis
Dec 08 20:14:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:35.956 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f378b9ae-fe6a-498a-b0ea-0d98aea69001
Dec 08 20:14:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:35.957 105024 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmphe9o3q3y/privsep.sock']
Dec 08 20:14:35 compute-0 NetworkManager[56229]: <info>  [1765224875.9584] device (tap488a5725-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 20:14:35 compute-0 NetworkManager[56229]: <info>  [1765224875.9592] device (tap488a5725-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 08 20:14:35 compute-0 systemd-machined[154122]: New machine qemu-1-instance-00000001.
Dec 08 20:14:35 compute-0 NetworkManager[56229]: <info>  [1765224875.9766] manager: (tapf25bb8cc-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Dec 08 20:14:35 compute-0 systemd-udevd[214596]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:14:35 compute-0 kernel: tapf25bb8cc-9d: entered promiscuous mode
Dec 08 20:14:35 compute-0 ovn_controller[96170]: 2025-12-08T20:14:35Z|00029|binding|INFO|Claiming lport f25bb8cc-9d43-433e-9e69-63da3f5c18ad for this chassis.
Dec 08 20:14:35 compute-0 ovn_controller[96170]: 2025-12-08T20:14:35Z|00030|binding|INFO|f25bb8cc-9d43-433e-9e69-63da3f5c18ad: Claiming fa:16:3e:02:16:36 10.100.0.9
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.982 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:35 compute-0 nova_compute[187787]: 2025-12-08 20:14:35.987 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:35 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec 08 20:14:35 compute-0 NetworkManager[56229]: <info>  [1765224875.9915] device (tapf25bb8cc-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 20:14:35 compute-0 NetworkManager[56229]: <info>  [1765224875.9925] device (tapf25bb8cc-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 08 20:14:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:35.996 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:16:36 10.100.0.9'], port_security=['fa:16:3e:02:16:36 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9deed673-fc96-4e81-b9ba-c3f0e83e1625', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b5e5772-d3b1-48b1-8423-229c5601e5b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71532308007a48d5aef697fbd39501f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '15728c5a-0648-40f0-b261-fafccf1978da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27a9d363-1573-4efe-9e17-2aefd8a246e5, chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=f25bb8cc-9d43-433e-9e69-63da3f5c18ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:14:36 compute-0 ovn_controller[96170]: 2025-12-08T20:14:36Z|00031|binding|INFO|Setting lport 488a5725-c797-4165-b8ce-319c48f2e8b8 ovn-installed in OVS
Dec 08 20:14:36 compute-0 ovn_controller[96170]: 2025-12-08T20:14:36Z|00032|binding|INFO|Setting lport 488a5725-c797-4165-b8ce-319c48f2e8b8 up in Southbound
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.003 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:36 compute-0 systemd-machined[154122]: New machine qemu-2-instance-00000002.
Dec 08 20:14:36 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 08 20:14:36 compute-0 ovn_controller[96170]: 2025-12-08T20:14:36Z|00033|binding|INFO|Setting lport f25bb8cc-9d43-433e-9e69-63da3f5c18ad ovn-installed in OVS
Dec 08 20:14:36 compute-0 ovn_controller[96170]: 2025-12-08T20:14:36Z|00034|binding|INFO|Setting lport f25bb8cc-9d43-433e-9e69-63da3f5c18ad up in Southbound
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.052 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.225 187791 INFO nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Creating config drive at /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.config
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.231 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmm_lvak4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.284 187791 DEBUG nova.compute.manager [req-3ec27fb0-fdc7-43ba-af1e-ee68c4b4e64f req-6408dad1-fe64-49ae-85a9-6099f164911b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-changed-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.286 187791 DEBUG nova.compute.manager [req-3ec27fb0-fdc7-43ba-af1e-ee68c4b4e64f req-6408dad1-fe64-49ae-85a9-6099f164911b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Refreshing instance network info cache due to event network-changed-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.286 187791 DEBUG oslo_concurrency.lockutils [req-3ec27fb0-fdc7-43ba-af1e-ee68c4b4e64f req-6408dad1-fe64-49ae-85a9-6099f164911b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.286 187791 DEBUG oslo_concurrency.lockutils [req-3ec27fb0-fdc7-43ba-af1e-ee68c4b4e64f req-6408dad1-fe64-49ae-85a9-6099f164911b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.286 187791 DEBUG nova.network.neutron [req-3ec27fb0-fdc7-43ba-af1e-ee68c4b4e64f req-6408dad1-fe64-49ae-85a9-6099f164911b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Refreshing network info cache for port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.356 187791 DEBUG oslo_concurrency.processutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmm_lvak4" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:36 compute-0 kernel: tapaf16ddfd-01: entered promiscuous mode
Dec 08 20:14:36 compute-0 NetworkManager[56229]: <info>  [1765224876.4165] manager: (tapaf16ddfd-01): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Dec 08 20:14:36 compute-0 ovn_controller[96170]: 2025-12-08T20:14:36Z|00035|binding|INFO|Claiming lport af16ddfd-01f8-4225-96a8-8ec9a5aa19ba for this chassis.
Dec 08 20:14:36 compute-0 ovn_controller[96170]: 2025-12-08T20:14:36Z|00036|binding|INFO|af16ddfd-01f8-4225-96a8-8ec9a5aa19ba: Claiming fa:16:3e:d3:0b:70 10.100.0.10
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.428 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:36.437 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:0b:70 10.100.0.10'], port_security=['fa:16:3e:d3:0b:70 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9f228c07-c6ac-479c-9edb-ceebc19eac87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66073b60-2cee-4d92-b656-15d29787b3b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '73150461bb354f0fb8f4adf266d52ac8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '128a4b39-b2dc-478a-8a31-e528cec44116', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9088ad92-ddfd-4933-9885-66eab30c7262, chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:14:36 compute-0 NetworkManager[56229]: <info>  [1765224876.4407] device (tapaf16ddfd-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 20:14:36 compute-0 NetworkManager[56229]: <info>  [1765224876.4421] device (tapaf16ddfd-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 08 20:14:36 compute-0 systemd-machined[154122]: New machine qemu-3-instance-00000003.
Dec 08 20:14:36 compute-0 ovn_controller[96170]: 2025-12-08T20:14:36Z|00037|binding|INFO|Setting lport af16ddfd-01f8-4225-96a8-8ec9a5aa19ba ovn-installed in OVS
Dec 08 20:14:36 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000003.
Dec 08 20:14:36 compute-0 ovn_controller[96170]: 2025-12-08T20:14:36Z|00038|binding|INFO|Setting lport af16ddfd-01f8-4225-96a8-8ec9a5aa19ba up in Southbound
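At this point ovn-controller has claimed the logical port for this chassis, marked it ovn-installed in OVS and set it up in the southbound database, which is what will eventually let Neutron emit the network-vif-plugged event Nova is still waiting for. Where ovn-sbctl is available on the node, the binding can be checked directly; a minimal sketch, assuming the installed ovn-sbctl supports the generic find/--columns database commands:

    # Illustrative only: confirm from the OVN southbound DB that the logical
    # port claimed above is bound to this chassis and marked up.
    import subprocess

    def port_binding(logical_port: str) -> str:
        return subprocess.run(
            ["ovn-sbctl", "--columns=logical_port,chassis,up",
             "find", "Port_Binding", f"logical_port={logical_port}"],
            check=True, capture_output=True, text=True).stdout

    print(port_binding("af16ddfd-01f8-4225-96a8-8ec9a5aa19ba"))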
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.491 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.581 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224876.5806234, 9deed673-fc96-4e81-b9ba-c3f0e83e1625 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.583 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] VM Started (Lifecycle Event)
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.625 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.629 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224876.5820305, 9deed673-fc96-4e81-b9ba-c3f0e83e1625 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.630 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] VM Paused (Lifecycle Event)
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.661 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.664 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.700 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.701 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224876.5834856, 31b38d28-b90e-434c-9967-912987aee08b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.701 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] VM Started (Lifecycle Event)
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.723 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.726 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224876.5837524, 31b38d28-b90e-434c-9967-912987aee08b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.727 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] VM Paused (Lifecycle Event)
Dec 08 20:14:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:36.734 105024 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 08 20:14:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:36.735 105024 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmphe9o3q3y/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 08 20:14:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:36.539 214668 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 08 20:14:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:36.552 214668 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 08 20:14:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:36.555 214668 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Dec 08 20:14:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:36.555 214668 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214668
Dec 08 20:14:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:36.737 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[10e6f7f0-3b8e-4cd2-bd06-03241477d408]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.749 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.753 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:14:36 compute-0 nova_compute[187787]: 2025-12-08 20:14:36.805 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.146 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224877.1463106, 9f228c07-c6ac-479c-9edb-ceebc19eac87 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.147 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] VM Started (Lifecycle Event)
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.185 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.189 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224877.1468897, 9f228c07-c6ac-479c-9edb-ceebc19eac87 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.190 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] VM Paused (Lifecycle Event)
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.216 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.218 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
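The "VM power_state: 3" in these sync messages is Nova's PAUSED code: libvirt brings the freshly defined guest up paused, and Nova only resumes it once spawn completes, so the power-state sync is skipped while task_state is still spawning. A minimal sketch, assuming the libvirt-python bindings are installed on the host, of reading that state directly for the domain started above:

    # Illustrative only: read the libvirt domain state behind the
    # "DB power_state: 0, VM power_state: 3" comparison logged above.
    import libvirt  # assumes the libvirt-python bindings are available

    NAMES = {
        libvirt.VIR_DOMAIN_NOSTATE: "nostate",
        libvirt.VIR_DOMAIN_RUNNING: "running",
        libvirt.VIR_DOMAIN_PAUSED: "paused",
        libvirt.VIR_DOMAIN_SHUTOFF: "shutoff",
    }

    conn = libvirt.open("qemu:///system")
    dom = conn.lookupByName("instance-00000003")
    state, _reason = dom.state()
    # a just-defined, not-yet-resumed guest reports paused here
    print(NAMES.get(state, state))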
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.325 214668 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.325 214668 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.325 214668 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.337 187791 DEBUG nova.network.neutron [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Updating instance_info_cache with network_info: [{"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.369 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.383 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Releasing lock "refresh_cache-1e4936a4-4e9a-45e3-9bdb-bd423abc6045" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.383 187791 DEBUG nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Instance network_info: |[{"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.385 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Start _get_guest_xml network_info=[{"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.388 187791 WARNING nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.392 187791 DEBUG nova.virt.libvirt.host [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.393 187791 DEBUG nova.virt.libvirt.host [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.395 187791 DEBUG nova.virt.libvirt.host [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.396 187791 DEBUG nova.virt.libvirt.host [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.396 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.396 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-08T20:13:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2f15909f-e95c-4c15-b311-ac90858a554d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.396 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.399 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.399 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.400 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.400 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.400 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.400 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.401 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.401 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.401 187791 DEBUG nova.virt.hardware [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.404 187791 DEBUG nova.virt.libvirt.vif [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:14:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1919705409',display_name='tempest-ServersTestManualDisk-server-1919705409',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1919705409',id=4,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMy7CWmi+A6syNCS8/WYtOKv7Vux4U24DOqxdQ+/7mPhbR66uGjy0QZAvUEJ1ZAAj6PhlIZqclMmdcMZZSllbJ5nImidjRemuKe7PQUnZX4pp7rsSfIX0hlGi1ppXkAlPg==',key_name='tempest-keypair-786640513',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09b5e1948374012b56a6b174f13203a',ramdisk_id='',reservation_id='r-inpxfaxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1603475250',owner_user_name='tempest-ServersTestManualDisk-1603475250-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:14:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='182a1bf5731c443d90e215465b085637',uuid=1e4936a4-4e9a-45e3-9bdb-bd423abc6045,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.404 187791 DEBUG nova.network.os_vif_util [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Converting VIF {"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.405 187791 DEBUG nova.network.os_vif_util [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2d:37,bridge_name='br-int',has_traffic_filtering=True,id=15884d80-a050-45e4-a91f-aa0953fde76b,network=Network(d362f29d-3769-41ec-9071-c5989fa7f4f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15884d80-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.406 187791 DEBUG nova.objects.instance [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e4936a4-4e9a-45e3-9bdb-bd423abc6045 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.431 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] End _get_guest_xml xml=<domain type="kvm">
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <uuid>1e4936a4-4e9a-45e3-9bdb-bd423abc6045</uuid>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <name>instance-00000004</name>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <memory>131072</memory>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <vcpu>1</vcpu>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <metadata>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <nova:name>tempest-ServersTestManualDisk-server-1919705409</nova:name>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <nova:creationTime>2025-12-08 20:14:37</nova:creationTime>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <nova:flavor name="m1.nano">
Dec 08 20:14:37 compute-0 nova_compute[187787]:         <nova:memory>128</nova:memory>
Dec 08 20:14:37 compute-0 nova_compute[187787]:         <nova:disk>1</nova:disk>
Dec 08 20:14:37 compute-0 nova_compute[187787]:         <nova:swap>0</nova:swap>
Dec 08 20:14:37 compute-0 nova_compute[187787]:         <nova:ephemeral>0</nova:ephemeral>
Dec 08 20:14:37 compute-0 nova_compute[187787]:         <nova:vcpus>1</nova:vcpus>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       </nova:flavor>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <nova:owner>
Dec 08 20:14:37 compute-0 nova_compute[187787]:         <nova:user uuid="182a1bf5731c443d90e215465b085637">tempest-ServersTestManualDisk-1603475250-project-member</nova:user>
Dec 08 20:14:37 compute-0 nova_compute[187787]:         <nova:project uuid="c09b5e1948374012b56a6b174f13203a">tempest-ServersTestManualDisk-1603475250</nova:project>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       </nova:owner>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <nova:root type="image" uuid="ffae60d8-1843-4b3a-9d11-b077095cedb9"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <nova:ports>
Dec 08 20:14:37 compute-0 nova_compute[187787]:         <nova:port uuid="15884d80-a050-45e4-a91f-aa0953fde76b">
Dec 08 20:14:37 compute-0 nova_compute[187787]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:         </nova:port>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       </nova:ports>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     </nova:instance>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   </metadata>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <sysinfo type="smbios">
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <system>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <entry name="manufacturer">RDO</entry>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <entry name="product">OpenStack Compute</entry>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <entry name="serial">1e4936a4-4e9a-45e3-9bdb-bd423abc6045</entry>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <entry name="uuid">1e4936a4-4e9a-45e3-9bdb-bd423abc6045</entry>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <entry name="family">Virtual Machine</entry>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     </system>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   </sysinfo>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <os>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <boot dev="hd"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <smbios mode="sysinfo"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   </os>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <features>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <acpi/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <apic/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <vmcoreinfo/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   </features>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <clock offset="utc">
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <timer name="pit" tickpolicy="delay"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <timer name="hpet" present="no"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   </clock>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <cpu mode="host-model" match="exact">
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <topology sockets="1" cores="1" threads="1"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <disk type="file" device="disk">
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <target dev="vda" bus="virtio"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <disk type="file" device="cdrom">
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <driver name="qemu" type="raw" cache="none"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk.config"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <target dev="sda" bus="sata"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <interface type="ethernet">
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <mac address="fa:16:3e:d7:2d:37"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <driver name="vhost" rx_queue_size="512"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <mtu size="1442"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <target dev="tap15884d80-a0"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <serial type="pty">
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <log file="/var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/console.log" append="off"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     </serial>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <video>
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     </video>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <input type="tablet" bus="usb"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <rng model="virtio">
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <backend model="random">/dev/urandom</backend>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <controller type="usb" index="0"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     <memballoon model="virtio">
Dec 08 20:14:37 compute-0 nova_compute[187787]:       <stats period="10"/>
Dec 08 20:14:37 compute-0 nova_compute[187787]:     </memballoon>
Dec 08 20:14:37 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:14:37 compute-0 nova_compute[187787]: </domain>
Dec 08 20:14:37 compute-0 nova_compute[187787]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.432 187791 DEBUG nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Preparing to wait for external event network-vif-plugged-15884d80-a050-45e4-a91f-aa0953fde76b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.432 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.433 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.433 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.434 187791 DEBUG nova.virt.libvirt.vif [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:14:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1919705409',display_name='tempest-ServersTestManualDisk-server-1919705409',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1919705409',id=4,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMy7CWmi+A6syNCS8/WYtOKv7Vux4U24DOqxdQ+/7mPhbR66uGjy0QZAvUEJ1ZAAj6PhlIZqclMmdcMZZSllbJ5nImidjRemuKe7PQUnZX4pp7rsSfIX0hlGi1ppXkAlPg==',key_name='tempest-keypair-786640513',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c09b5e1948374012b56a6b174f13203a',ramdisk_id='',reservation_id='r-inpxfaxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestManualDisk-1603475250',owner_user_name='tempest-ServersTestManualDisk-1603475250-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:14:28Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='182a1bf5731c443d90e215465b085637',uuid=1e4936a4-4e9a-45e3-9bdb-bd423abc6045,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.434 187791 DEBUG nova.network.os_vif_util [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Converting VIF {"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.434 187791 DEBUG nova.network.os_vif_util [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2d:37,bridge_name='br-int',has_traffic_filtering=True,id=15884d80-a050-45e4-a91f-aa0953fde76b,network=Network(d362f29d-3769-41ec-9071-c5989fa7f4f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15884d80-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.435 187791 DEBUG os_vif [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2d:37,bridge_name='br-int',has_traffic_filtering=True,id=15884d80-a050-45e4-a91f-aa0953fde76b,network=Network(d362f29d-3769-41ec-9071-c5989fa7f4f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15884d80-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.435 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.436 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.436 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.438 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.438 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15884d80-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.439 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap15884d80-a0, col_values=(('external_ids', {'iface-id': '15884d80-a050-45e4-a91f-aa0953fde76b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:2d:37', 'vm-uuid': '1e4936a4-4e9a-45e3-9bdb-bd423abc6045'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.440 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:37 compute-0 NetworkManager[56229]: <info>  [1765224877.4415] manager: (tap15884d80-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.442 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.445 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.447 187791 INFO os_vif [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:2d:37,bridge_name='br-int',has_traffic_filtering=True,id=15884d80-a050-45e4-a91f-aa0953fde76b,network=Network(d362f29d-3769-41ec-9071-c5989fa7f4f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15884d80-a0')
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.509 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.510 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.510 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] No VIF found with MAC fa:16:3e:d7:2d:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 08 20:14:37 compute-0 nova_compute[187787]: 2025-12-08 20:14:37.511 187791 INFO nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Using config drive
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.949 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[0481b13a-d04f-4fd5-8f89-0bbadc04999a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.950 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf378b9ae-f1 in ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.952 214668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf378b9ae-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.952 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[2bdbe05e-2ea7-41d6-9c9f-d31b49e15b14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.955 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[cf3aa814-c761-4b03-880e-2ba2be85feba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.974 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[9929f052-5d68-4730-ab91-7ec348f1f968]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.996 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[557c4da3-9185-4381-b88e-70ba9cfdd381]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:37 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:37.999 105024 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmplwt79l38/privsep.sock']
Dec 08 20:14:38 compute-0 nova_compute[187787]: 2025-12-08 20:14:38.608 187791 INFO nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Creating config drive at /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk.config
Dec 08 20:14:38 compute-0 nova_compute[187787]: 2025-12-08 20:14:38.616 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwu2j0efe execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:14:38 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:38.681 105024 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Dec 08 20:14:38 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:38.683 105024 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmplwt79l38/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Dec 08 20:14:38 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:38.561 214695 INFO oslo.privsep.daemon [-] privsep daemon starting
Dec 08 20:14:38 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:38.566 214695 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Dec 08 20:14:38 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:38.568 214695 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Dec 08 20:14:38 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:38.569 214695 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214695
Dec 08 20:14:38 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:38.685 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[6e4a852d-0d5e-487a-bb7f-0335ad9db752]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:38 compute-0 nova_compute[187787]: 2025-12-08 20:14:38.749 187791 DEBUG oslo_concurrency.processutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwu2j0efe" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:14:38 compute-0 nova_compute[187787]: 2025-12-08 20:14:38.760 187791 DEBUG nova.network.neutron [req-3ec27fb0-fdc7-43ba-af1e-ee68c4b4e64f req-6408dad1-fe64-49ae-85a9-6099f164911b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updated VIF entry in instance network info cache for port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:14:38 compute-0 nova_compute[187787]: 2025-12-08 20:14:38.761 187791 DEBUG nova.network.neutron [req-3ec27fb0-fdc7-43ba-af1e-ee68c4b4e64f req-6408dad1-fe64-49ae-85a9-6099f164911b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updating instance_info_cache with network_info: [{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:38 compute-0 nova_compute[187787]: 2025-12-08 20:14:38.795 187791 DEBUG oslo_concurrency.lockutils [req-3ec27fb0-fdc7-43ba-af1e-ee68c4b4e64f req-6408dad1-fe64-49ae-85a9-6099f164911b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:38 compute-0 kernel: tap15884d80-a0: entered promiscuous mode
Dec 08 20:14:38 compute-0 NetworkManager[56229]: <info>  [1765224878.8062] manager: (tap15884d80-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Dec 08 20:14:38 compute-0 ovn_controller[96170]: 2025-12-08T20:14:38Z|00039|binding|INFO|Claiming lport 15884d80-a050-45e4-a91f-aa0953fde76b for this chassis.
Dec 08 20:14:38 compute-0 ovn_controller[96170]: 2025-12-08T20:14:38Z|00040|binding|INFO|15884d80-a050-45e4-a91f-aa0953fde76b: Claiming fa:16:3e:d7:2d:37 10.100.0.5
Dec 08 20:14:38 compute-0 nova_compute[187787]: 2025-12-08 20:14:38.808 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:38 compute-0 nova_compute[187787]: 2025-12-08 20:14:38.821 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:38 compute-0 NetworkManager[56229]: <info>  [1765224878.8279] device (tap15884d80-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 20:14:38 compute-0 NetworkManager[56229]: <info>  [1765224878.8288] device (tap15884d80-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 08 20:14:38 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:38.830 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:2d:37 10.100.0.5'], port_security=['fa:16:3e:d7:2d:37 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1e4936a4-4e9a-45e3-9bdb-bd423abc6045', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d362f29d-3769-41ec-9071-c5989fa7f4f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09b5e1948374012b56a6b174f13203a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8587a849-6ae8-41e1-b943-a03fa78a52c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41fe0558-284a-4ff7-a89b-c230752e9d97, chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=15884d80-a050-45e4-a91f-aa0953fde76b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:14:38 compute-0 systemd-machined[154122]: New machine qemu-4-instance-00000004.
Dec 08 20:14:38 compute-0 nova_compute[187787]: 2025-12-08 20:14:38.866 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:38 compute-0 ovn_controller[96170]: 2025-12-08T20:14:38Z|00041|binding|INFO|Setting lport 15884d80-a050-45e4-a91f-aa0953fde76b ovn-installed in OVS
Dec 08 20:14:38 compute-0 ovn_controller[96170]: 2025-12-08T20:14:38Z|00042|binding|INFO|Setting lport 15884d80-a050-45e4-a91f-aa0953fde76b up in Southbound
Dec 08 20:14:38 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.200 214695 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.200 214695 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.200 214695 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.219 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224879.218298, 1e4936a4-4e9a-45e3-9bdb-bd423abc6045 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.219 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] VM Started (Lifecycle Event)
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.259 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.263 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received event network-changed-15884d80-a050-45e4-a91f-aa0953fde76b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.264 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Refreshing instance network info cache due to event network-changed-15884d80-a050-45e4-a91f-aa0953fde76b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.264 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-1e4936a4-4e9a-45e3-9bdb-bd423abc6045" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.264 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-1e4936a4-4e9a-45e3-9bdb-bd423abc6045" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.264 187791 DEBUG nova.network.neutron [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Refreshing network info cache for port 15884d80-a050-45e4-a91f-aa0953fde76b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.270 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224879.218577, 1e4936a4-4e9a-45e3-9bdb-bd423abc6045 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.270 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] VM Paused (Lifecycle Event)
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.322 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.332 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:14:39 compute-0 nova_compute[187787]: 2025-12-08 20:14:39.385 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] During sync_power_state the instance has a pending task (spawning). Skip.
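[Note] The two "Synchronizing instance power state" entries above compare the value stored in the Nova database (0) with what libvirt reports for the domain (3 here, 1 after the instance resumes). A minimal sketch for decoding those numbers, assuming they mirror the constants defined in nova/compute/power_state.py:

    # Lookup table mirroring the standard Nova power_state constants; shown only
    # to decode lines like "current DB power_state: 0, VM power_state: 3" above.
    POWER_STATES = {
        0x00: "NOSTATE",    # instance row not yet updated (building/spawning)
        0x01: "RUNNING",    # libvirt domain is running
        0x03: "PAUSED",     # domain paused, e.g. while VIF plugging completes
        0x04: "SHUTDOWN",
        0x06: "CRASHED",
        0x07: "SUSPENDED",
    }

    def describe(db_state: int, vm_state: int) -> str:
        """Render a sync_power_state comparison like the log lines above."""
        return (f"DB={POWER_STATES.get(db_state, db_state)} "
                f"VM={POWER_STATES.get(vm_state, vm_state)}")

    print(describe(0, 3))  # DB=NOSTATE VM=PAUSED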
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.788 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e8145d-3863-4787-a26d-a7acb7e0b210]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.805 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[2e0793f6-a708-4653-96ef-487f2b963327]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:39 compute-0 NetworkManager[56229]: <info>  [1765224879.8070] manager: (tapf378b9ae-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Dec 08 20:14:39 compute-0 systemd-udevd[214737]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.834 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[bf095169-e070-46e7-88a0-f0599e7ecf93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.838 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[00ace9f2-328c-42dd-beec-f91e5846e448]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:39 compute-0 NetworkManager[56229]: <info>  [1765224879.8569] device (tapf378b9ae-f0): carrier: link connected
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.860 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[c791e503-de42-4b97-ba50-e5ada01054d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.879 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[6cadda6f-4b07-4694-bc58-2688c983902d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf378b9ae-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:1a:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341745, 'reachable_time': 34038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214757, 'error': None, 'target': 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.899 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[982add7f-11f3-4917-967f-8e3cc69d3573]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:1afe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341745, 'tstamp': 341745}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214758, 'error': None, 'target': 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.920 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f161766e-55e8-4e01-a3bd-4d2bc95f6d03]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf378b9ae-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:1a:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341745, 'reachable_time': 34038, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214759, 'error': None, 'target': 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:39 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:39.956 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[588c87f8-1805-41d8-b0c9-8cf46c28a11c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.021 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[41090a0d-25ba-47ae-b07b-2f7114b57264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.024 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf378b9ae-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.024 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.024 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf378b9ae-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.027 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 kernel: tapf378b9ae-f0: entered promiscuous mode
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.029 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 NetworkManager[56229]: <info>  [1765224880.0299] manager: (tapf378b9ae-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.033 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf378b9ae-f0, col_values=(('external_ids', {'iface-id': 'e1bfa09e-e9ff-44b2-80e4-f07b8a208bf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
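[Note] The ovsdbapp transaction at 20:14:40.024-.033 removes the tap device from br-ex (a no-op here, hence "Transaction caused no change"), adds it to br-int, and tags the Interface row with the Neutron port UUID so ovn-controller can bind it. A minimal CLI-equivalent sketch of what those three commands amount to, with the names copied from the log; this is an illustration, not the agent's own code, and assumes ovs-vsctl is available on the host:

    import subprocess

    PORT = "tapf378b9ae-f0"                               # veth end from the log above
    IFACE_ID = "e1bfa09e-e9ff-44b2-80e4-f07b8a208bf6"     # Neutron port UUID from DbSetCommand

    # DelPortCommand(port=tapf378b9ae-f0, bridge=br-ex, if_exists=True)
    subprocess.run(["ovs-vsctl", "--if-exists", "del-port", "br-ex", PORT], check=True)
    # AddPortCommand(bridge=br-int, port=tapf378b9ae-f0, may_exist=True)
    subprocess.run(["ovs-vsctl", "--may-exist", "add-port", "br-int", PORT], check=True)
    # DbSetCommand(table=Interface, col_values=external_ids:iface-id=...)
    subprocess.run(["ovs-vsctl", "set", "Interface", PORT,
                    f"external_ids:iface-id={IFACE_ID}"], check=True)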
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.034 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 ovn_controller[96170]: 2025-12-08T20:14:40Z|00043|binding|INFO|Releasing lport e1bfa09e-e9ff-44b2-80e4-f07b8a208bf6 from this chassis (sb_readonly=0)
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.034 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.035 105024 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f378b9ae-fe6a-498a-b0ea-0d98aea69001.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f378b9ae-fe6a-498a-b0ea-0d98aea69001.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.036 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[52a7a676-90fc-4ea6-863c-7ed585270093]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.037 105024 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: global
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     log         /dev/log local0 debug
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     log-tag     haproxy-metadata-proxy-f378b9ae-fe6a-498a-b0ea-0d98aea69001
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     user        root
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     group       root
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     maxconn     1024
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     pidfile     /var/lib/neutron/external/pids/f378b9ae-fe6a-498a-b0ea-0d98aea69001.pid.haproxy
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     daemon
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: defaults
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     log global
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     mode http
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     option httplog
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     option dontlognull
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     option http-server-close
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     option forwardfor
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     retries                 3
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout http-request    30s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout connect         30s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout client          32s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout server          32s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout http-keep-alive 30s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: listen listener
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     bind 169.254.169.254:80
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     server metadata /var/lib/neutron/metadata_proxy
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     http-request add-header X-OVN-Network-ID f378b9ae-fe6a-498a-b0ea-0d98aea69001
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.038 105024 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'env', 'PROCESS_TAG=haproxy-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f378b9ae-fe6a-498a-b0ea-0d98aea69001.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
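[Note] The agent writes the haproxy_cfg shown above to /var/lib/neutron/ovn-metadata-proxy/<network-id>.conf and then launches haproxy inside the ovnmeta-<network-id> namespace through neutron-rootwrap, as logged in the "Running command" entry. A minimal sketch of those two steps with the rootwrap wrapper and PROCESS_TAG environment dropped for brevity (illustrative only, assumes root privileges and paths taken from the log):

    import subprocess

    NET_ID = "f378b9ae-fe6a-498a-b0ea-0d98aea69001"       # network UUID from the log
    CFG = f"/var/lib/neutron/ovn-metadata-proxy/{NET_ID}.conf"
    NS = f"ovnmeta-{NET_ID}"

    def launch_metadata_proxy(haproxy_cfg: str) -> None:
        """Write the rendered config, then start haproxy inside the namespace."""
        with open(CFG, "w") as f:
            f.write(haproxy_cfg)
        # The real agent runs this via `sudo neutron-rootwrap /etc/neutron/rootwrap.conf ...`.
        subprocess.run(["ip", "netns", "exec", NS, "haproxy", "-f", CFG], check=True)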
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.047 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 podman[214792]: 2025-12-08 20:14:40.409545342 +0000 UTC m=+0.068229350 container create d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:14:40 compute-0 systemd[1]: Started libpod-conmon-d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552.scope.
Dec 08 20:14:40 compute-0 podman[214792]: 2025-12-08 20:14:40.372983167 +0000 UTC m=+0.031667265 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:14:40 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:14:40 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/589ed7e28ecf9e04ca4cfd0159b4dba4a46e82d290a02dc79701646f72c30c5d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 08 20:14:40 compute-0 podman[214792]: 2025-12-08 20:14:40.483278302 +0000 UTC m=+0.141962330 container init d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 08 20:14:40 compute-0 podman[214792]: 2025-12-08 20:14:40.488424772 +0000 UTC m=+0.147108780 container start d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 08 20:14:40 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[214809]: [NOTICE]   (214813) : New worker (214815) forked
Dec 08 20:14:40 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[214809]: [NOTICE]   (214813) : Loading success.
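[Note] The proxy itself runs as a podman side container named neutron-haproxy-ovnmeta-<network-id> (created, initialized, and started in the podman entries above); the haproxy NOTICE lines are its stdout. A small sketch, wrapped in Python to match the other examples, for locating and following that container on the compute node; the name is taken from the log:

    import subprocess

    NAME = "neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001"

    # List the metadata-proxy container(s) matching this network.
    subprocess.run(["podman", "ps", "--filter", f"name={NAME}"], check=True)
    # Follow its output (the haproxy "New worker forked" / "Loading success" lines).
    subprocess.run(["podman", "logs", "-f", NAME], check=True)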
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.541 105024 INFO neutron.agent.ovn.metadata.agent [-] Port f25bb8cc-9d43-433e-9e69-63da3f5c18ad in datapath 9b5e5772-d3b1-48b1-8423-229c5601e5b8 unbound from our chassis
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.543 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9b5e5772-d3b1-48b1-8423-229c5601e5b8
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.554 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[38d9fb38-aa6f-4df5-b752-005c02849d4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.555 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9b5e5772-d1 in ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.556 214668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9b5e5772-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.556 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[777fed44-3e65-4e6a-9f98-c800daaee4c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.557 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[299bf288-8b5c-47c5-b86c-ca53b407aff9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
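[Note] Provisioning the 9b5e5772 datapath starts with a probe for an existing device (the "Interface tap9b5e5772-d0 not found" reply) and then creates a veth pair with one end moved into the ovnmeta namespace, as logged by provision_datapath. The agent does this through neutron.privileged.agent.linux.ip_lib (pyroute2); the following is a simplified iproute2-equivalent sketch using the interface and namespace names from the log, assuming a clean starting state:

    import subprocess

    NS = "ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8"   # namespace name from the log
    HOST_END = "tap9b5e5772-d0"   # root-namespace end, later added to br-int
    NS_END = "tap9b5e5772-d1"     # in-namespace end that will serve 169.254.169.254

    def run(*cmd: str) -> None:
        subprocess.run(cmd, check=True)

    # Matches the "not found" probe above: the namespace and pair do not exist yet.
    run("ip", "netns", "add", NS)
    run("ip", "link", "add", HOST_END, "type", "veth", "peer", "name", NS_END)
    run("ip", "link", "set", NS_END, "netns", NS)
    run("ip", "netns", "exec", NS, "ip", "link", "set", NS_END, "up")
    run("ip", "link", "set", HOST_END, "up")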
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.568 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.582 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[b929cf4c-c07b-467c-86bc-6946fd249f90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.606 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[0840977d-96db-47b2-9d38-adc442f2fcf3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.632 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f50782-f3d7-416b-85b5-165005f32a5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.638 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[47a4f879-9c99-47f6-9e67-03fe64ca84f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 NetworkManager[56229]: <info>  [1765224880.6394] manager: (tap9b5e5772-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Dec 08 20:14:40 compute-0 systemd-udevd[214745]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.665 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[cae1d5c5-7fe6-43d1-a5e0-18062568a1e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.668 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[5489f456-f36e-411c-bb43-78fdcddee7e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 NetworkManager[56229]: <info>  [1765224880.6868] device (tap9b5e5772-d0): carrier: link connected
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.690 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[a97ef3a4-0fdb-43c8-a880-ae213dc6d666]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.703 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[03cc10a5-f798-444a-a173-bd1e9f25e392]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b5e5772-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:f2:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341828, 'reachable_time': 39719, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214835, 'error': None, 'target': 'ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.716 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[a65a821c-a333-480c-92e7-124f86d87f77]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe72:f2b5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341828, 'tstamp': 341828}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214837, 'error': None, 'target': 'ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.731 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[97811e21-3b24-4b38-a51d-bf3ae4718f6c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9b5e5772-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:72:f2:b5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341828, 'reachable_time': 39719, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214838, 'error': None, 'target': 'ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.758 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b5ef79-894f-43da-bd2d-2413760a01a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.815 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[71dd61e1-dfc8-4148-ba28-3e36c8618d21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.816 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b5e5772-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.817 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.817 187791 DEBUG nova.compute.manager [req-97942d2f-7ba6-4ab5-9f55-2b7e2e70eef8 req-eec1dc01-e8bb-49ac-aa64-bd9c3e054c77 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received event network-vif-plugged-15884d80-a050-45e4-a91f-aa0953fde76b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.817 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9b5e5772-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.817 187791 DEBUG oslo_concurrency.lockutils [req-97942d2f-7ba6-4ab5-9f55-2b7e2e70eef8 req-eec1dc01-e8bb-49ac-aa64-bd9c3e054c77 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.818 187791 DEBUG oslo_concurrency.lockutils [req-97942d2f-7ba6-4ab5-9f55-2b7e2e70eef8 req-eec1dc01-e8bb-49ac-aa64-bd9c3e054c77 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.818 187791 DEBUG oslo_concurrency.lockutils [req-97942d2f-7ba6-4ab5-9f55-2b7e2e70eef8 req-eec1dc01-e8bb-49ac-aa64-bd9c3e054c77 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.818 187791 DEBUG nova.compute.manager [req-97942d2f-7ba6-4ab5-9f55-2b7e2e70eef8 req-eec1dc01-e8bb-49ac-aa64-bd9c3e054c77 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Processing event network-vif-plugged-15884d80-a050-45e4-a91f-aa0953fde76b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.819 187791 DEBUG nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 08 20:14:40 compute-0 kernel: tap9b5e5772-d0: entered promiscuous mode
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.819 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 NetworkManager[56229]: <info>  [1765224880.8199] manager: (tap9b5e5772-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.821 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.823 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9b5e5772-d0, col_values=(('external_ids', {'iface-id': '8810d692-3563-4549-98b0-f9429e158843'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.824 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.832 187791 INFO nova.virt.libvirt.driver [-] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Instance spawned successfully.
Dec 08 20:14:40 compute-0 ovn_controller[96170]: 2025-12-08T20:14:40Z|00044|binding|INFO|Releasing lport 8810d692-3563-4549-98b0-f9429e158843 from this chassis (sb_readonly=0)
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.833 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224880.8320243, 1e4936a4-4e9a-45e3-9bdb-bd423abc6045 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.833 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] VM Resumed (Lifecycle Event)
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.833 105024 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9b5e5772-d3b1-48b1-8423-229c5601e5b8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9b5e5772-d3b1-48b1-8423-229c5601e5b8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.834 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.834 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[82dc4fa8-193c-4d96-92e6-df3a8ef3fe51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.835 105024 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: global
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     log         /dev/log local0 debug
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     log-tag     haproxy-metadata-proxy-9b5e5772-d3b1-48b1-8423-229c5601e5b8
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     user        root
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     group       root
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     maxconn     1024
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     pidfile     /var/lib/neutron/external/pids/9b5e5772-d3b1-48b1-8423-229c5601e5b8.pid.haproxy
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     daemon
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: defaults
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     log global
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     mode http
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     option httplog
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     option dontlognull
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     option http-server-close
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     option forwardfor
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     retries                 3
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout http-request    30s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout connect         30s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout client          32s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout server          32s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     timeout http-keep-alive 30s
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: listen listener
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     bind 169.254.169.254:80
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     server metadata /var/lib/neutron/metadata_proxy
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:     http-request add-header X-OVN-Network-ID 9b5e5772-d3b1-48b1-8423-229c5601e5b8
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.835 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 08 20:14:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:40.836 105024 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8', 'env', 'PROCESS_TAG=haproxy-9b5e5772-d3b1-48b1-8423-229c5601e5b8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9b5e5772-d3b1-48b1-8423-229c5601e5b8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.844 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.862 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.866 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.866 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.867 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.867 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.867 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.868 187791 DEBUG nova.virt.libvirt.driver [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.874 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.909 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.983 187791 INFO nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Took 12.13 seconds to spawn the instance on the hypervisor.
Dec 08 20:14:40 compute-0 nova_compute[187787]: 2025-12-08 20:14:40.984 187791 DEBUG nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:41 compute-0 nova_compute[187787]: 2025-12-08 20:14:41.071 187791 INFO nova.compute.manager [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Took 12.78 seconds to build instance.
Dec 08 20:14:41 compute-0 nova_compute[187787]: 2025-12-08 20:14:41.100 187791 DEBUG oslo_concurrency.lockutils [None req-52af91d1-9682-47a4-a94b-a2c1f40577a9 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:41 compute-0 podman[214870]: 2025-12-08 20:14:41.216281357 +0000 UTC m=+0.045536425 container create 8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 08 20:14:41 compute-0 systemd[1]: Started libpod-conmon-8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f.scope.
Dec 08 20:14:41 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:14:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e563203fdf2b04dd4a2dcdcd5ea92cf38d629511e45263b741cf075cfddba2ae/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 08 20:14:41 compute-0 podman[214870]: 2025-12-08 20:14:41.191154146 +0000 UTC m=+0.020409214 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:14:41 compute-0 podman[214870]: 2025-12-08 20:14:41.290610396 +0000 UTC m=+0.119865474 container init 8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:14:41 compute-0 podman[214870]: 2025-12-08 20:14:41.296342324 +0000 UTC m=+0.125597392 container start 8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 08 20:14:41 compute-0 neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8[214886]: [NOTICE]   (214890) : New worker (214892) forked
Dec 08 20:14:41 compute-0 neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8[214886]: [NOTICE]   (214890) : Loading success.
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.353 105024 INFO neutron.agent.ovn.metadata.agent [-] Port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba in datapath 66073b60-2cee-4d92-b656-15d29787b3b5 unbound from our chassis
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.356 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 66073b60-2cee-4d92-b656-15d29787b3b5
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.366 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[b601de9e-2478-4ebf-bfd7-2ed2779e1e34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.367 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap66073b60-21 in ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.369 214668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap66073b60-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.369 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3c8bd8-fd5c-4862-b42d-b5622bbaa096]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.370 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[5d16fb5d-78db-4b20-857e-d25ac33a7799]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.392 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[eda22a35-eda9-4958-9fc2-015cb2650d4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.417 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[b03eb10b-cd35-405d-8d92-64f2a98ca0e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.446 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[95f5f69d-4f7a-4757-aef6-78f1219dcd60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 NetworkManager[56229]: <info>  [1765224881.4544] manager: (tap66073b60-20): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.458 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[7b8523af-8a4a-43ca-ac7e-34923f34ef90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.494 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[feb50f54-f65a-452d-86d7-ef0af09231c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.497 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[0c4ac200-545b-4587-96c1-9bef49e2e30b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 NetworkManager[56229]: <info>  [1765224881.5198] device (tap66073b60-20): carrier: link connected
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.526 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[f368ff57-bfe6-4ee5-8ca3-afe6e2e0237b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.542 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[c66b8391-a244-4664-ace8-8c2b4a6e888f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66073b60-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:71:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341911, 'reachable_time': 27547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214913, 'error': None, 'target': 'ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.561 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[fb942504-c22a-4500-9527-3fcc6ebd69a7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe33:717c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 341911, 'tstamp': 341911}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214914, 'error': None, 'target': 'ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.581 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[95445cc0-f829-41b8-9be1-70bf6e7cf3bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap66073b60-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:33:71:7c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341911, 'reachable_time': 27547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214915, 'error': None, 'target': 'ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.614 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[26a47425-e6e2-4ffc-aaba-1df41da10d07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.687 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[ed38b5cb-4bfb-49a4-9eee-34d641a6fcb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.688 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66073b60-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.689 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.689 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap66073b60-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:41 compute-0 kernel: tap66073b60-20: entered promiscuous mode
Dec 08 20:14:41 compute-0 nova_compute[187787]: 2025-12-08 20:14:41.691 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:41 compute-0 NetworkManager[56229]: <info>  [1765224881.6922] manager: (tap66073b60-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.695 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap66073b60-20, col_values=(('external_ids', {'iface-id': '0c8652ec-61a0-4d40-8e5f-b6b7db09dd2b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:41 compute-0 nova_compute[187787]: 2025-12-08 20:14:41.696 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:41 compute-0 ovn_controller[96170]: 2025-12-08T20:14:41Z|00045|binding|INFO|Releasing lport 0c8652ec-61a0-4d40-8e5f-b6b7db09dd2b from this chassis (sb_readonly=0)
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.697 105024 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/66073b60-2cee-4d92-b656-15d29787b3b5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/66073b60-2cee-4d92-b656-15d29787b3b5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.707 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[99a41fcb-765e-454b-a74c-27f2e73d4c19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.708 105024 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: global
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     log         /dev/log local0 debug
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     log-tag     haproxy-metadata-proxy-66073b60-2cee-4d92-b656-15d29787b3b5
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     user        root
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     group       root
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     maxconn     1024
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     pidfile     /var/lib/neutron/external/pids/66073b60-2cee-4d92-b656-15d29787b3b5.pid.haproxy
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     daemon
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: defaults
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     log global
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     mode http
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     option httplog
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     option dontlognull
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     option http-server-close
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     option forwardfor
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     retries                 3
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     timeout http-request    30s
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     timeout connect         30s
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     timeout client          32s
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     timeout server          32s
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     timeout http-keep-alive 30s
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: listen listener
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     bind 169.254.169.254:80
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     server metadata /var/lib/neutron/metadata_proxy
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:     http-request add-header X-OVN-Network-ID 66073b60-2cee-4d92-b656-15d29787b3b5
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 08 20:14:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:41.709 105024 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5', 'env', 'PROCESS_TAG=haproxy-66073b60-2cee-4d92-b656-15d29787b3b5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/66073b60-2cee-4d92-b656-15d29787b3b5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 08 20:14:41 compute-0 nova_compute[187787]: 2025-12-08 20:14:41.708 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.068 187791 DEBUG nova.compute.manager [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.069 187791 DEBUG oslo_concurrency.lockutils [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.069 187791 DEBUG oslo_concurrency.lockutils [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.070 187791 DEBUG oslo_concurrency.lockutils [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.070 187791 DEBUG nova.compute.manager [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Processing event network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.070 187791 DEBUG nova.compute.manager [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.071 187791 DEBUG oslo_concurrency.lockutils [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.071 187791 DEBUG oslo_concurrency.lockutils [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.071 187791 DEBUG oslo_concurrency.lockutils [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.072 187791 DEBUG nova.compute.manager [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] No waiting events found dispatching network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.072 187791 WARNING nova.compute.manager [req-303f20f8-dfbd-4277-866c-9218a615ecc9 req-0b306ff9-824e-43a8-84c4-8f4949837a44 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received unexpected event network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba for instance with vm_state building and task_state spawning.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.073 187791 DEBUG nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.079 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.085 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224882.085132, 9f228c07-c6ac-479c-9edb-ceebc19eac87 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.086 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] VM Resumed (Lifecycle Event)
Dec 08 20:14:42 compute-0 podman[214947]: 2025-12-08 20:14:42.091582102 +0000 UTC m=+0.073972419 container create 4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.092 187791 INFO nova.virt.libvirt.driver [-] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Instance spawned successfully.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.093 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 08 20:14:42 compute-0 systemd[1]: Started libpod-conmon-4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911.scope.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.133 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.137 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:14:42 compute-0 podman[214947]: 2025-12-08 20:14:42.045624074 +0000 UTC m=+0.028014411 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.146 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.147 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.148 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.149 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.149 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.150 187791 DEBUG nova.virt.libvirt.driver [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:14:42 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c429656d1c4f4b073eac50c2d6d9fbfd370216964d3fdfbe354849a539a6166/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 08 20:14:42 compute-0 podman[214947]: 2025-12-08 20:14:42.172905948 +0000 UTC m=+0.155296285 container init 4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Dec 08 20:14:42 compute-0 podman[214947]: 2025-12-08 20:14:42.179944976 +0000 UTC m=+0.162335293 container start 4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 08 20:14:42 compute-0 neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5[214962]: [NOTICE]   (214966) : New worker (214968) forked
Dec 08 20:14:42 compute-0 neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5[214962]: [NOTICE]   (214966) : Loading success.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.203 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:14:42 compute-0 sshd-session[214805]: Received disconnect from 103.172.28.62 port 58434:11: Bye Bye [preauth]
Dec 08 20:14:42 compute-0 sshd-session[214805]: Disconnected from authenticating user root 103.172.28.62 port 58434 [preauth]
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.242 105024 INFO neutron.agent.ovn.metadata.agent [-] Port 15884d80-a050-45e4-a91f-aa0953fde76b in datapath d362f29d-3769-41ec-9071-c5989fa7f4f1 unbound from our chassis
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.246 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d362f29d-3769-41ec-9071-c5989fa7f4f1
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.259 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[27aa74be-955f-4a47-88db-fd297f601456]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.260 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd362f29d-31 in ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.262 214668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd362f29d-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.262 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[6f0ba3f6-32e2-4537-869c-cff471f25c03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.264 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[d7fb152b-acf4-43da-9c41-8a5455eea3f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.269 187791 INFO nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Took 14.90 seconds to spawn the instance on the hypervisor.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.270 187791 DEBUG nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.288 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[0e14ff30-5634-44f1-9df2-0ca9cadf18f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.302 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[b74807d1-3db9-409a-9290-f378e6e2aa19]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.332 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac68648-abc1-48da-acad-02bddab2b799]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 NetworkManager[56229]: <info>  [1765224882.3466] manager: (tapd362f29d-30): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.351 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[a7cb25bf-5b95-47b3-b59a-9ca79932fc8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 systemd-udevd[214985]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.384 187791 INFO nova.compute.manager [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Took 16.22 seconds to build instance.
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.392 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[348a171c-ddda-4131-baa1-4c39418f122b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.396 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[495fddeb-cfd5-4140-ab82-0d7e506a42ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 NetworkManager[56229]: <info>  [1765224882.4209] device (tapd362f29d-30): carrier: link connected
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.425 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[df2cb6ad-8e81-4461-b52c-94082596fca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.425 187791 DEBUG oslo_concurrency.lockutils [None req-7c1ca3de-56b2-43ac-9e32-2592f0c66135 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.371s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.441 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.442 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[5689b330-7d90-4bbf-921d-bd80069fdc7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd362f29d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:45:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342001, 'reachable_time': 37724, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215004, 'error': None, 'target': 'ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.471 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb7914c-d523-46e7-8d9d-77e93b87be03]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:45c8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 342001, 'tstamp': 342001}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215005, 'error': None, 'target': 'ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.487 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[41ec79a8-f1a9-4c0f-8ab7-72cf51fd7d1a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd362f29d-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:45:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 342001, 'reachable_time': 37724, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215006, 'error': None, 'target': 'ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.528 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[745975ee-feda-4fbf-a53c-2af9e58be671]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.591 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[36f2c2f3-c36f-4169-a5dc-dce90d4c3851]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.592 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd362f29d-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.593 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.593 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd362f29d-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.595 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:42 compute-0 NetworkManager[56229]: <info>  [1765224882.5957] manager: (tapd362f29d-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Dec 08 20:14:42 compute-0 kernel: tapd362f29d-30: entered promiscuous mode
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.597 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd362f29d-30, col_values=(('external_ids', {'iface-id': 'f0094d9b-0146-4554-97c5-906c6477650e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.598 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:42 compute-0 ovn_controller[96170]: 2025-12-08T20:14:42Z|00046|binding|INFO|Releasing lport f0094d9b-0146-4554-97c5-906c6477650e from this chassis (sb_readonly=0)
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.612 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.613 105024 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d362f29d-3769-41ec-9071-c5989fa7f4f1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d362f29d-3769-41ec-9071-c5989fa7f4f1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.614 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[43a53e9a-f2d0-446a-aabc-9571b832af37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.615 105024 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: global
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     log         /dev/log local0 debug
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     log-tag     haproxy-metadata-proxy-d362f29d-3769-41ec-9071-c5989fa7f4f1
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     user        root
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     group       root
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     maxconn     1024
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     pidfile     /var/lib/neutron/external/pids/d362f29d-3769-41ec-9071-c5989fa7f4f1.pid.haproxy
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     daemon
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: defaults
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     log global
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     mode http
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     option httplog
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     option dontlognull
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     option http-server-close
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     option forwardfor
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     retries                 3
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     timeout http-request    30s
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     timeout connect         30s
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     timeout client          32s
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     timeout server          32s
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     timeout http-keep-alive 30s
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: listen listener
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     bind 169.254.169.254:80
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     server metadata /var/lib/neutron/metadata_proxy
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:     http-request add-header X-OVN-Network-ID d362f29d-3769-41ec-9071-c5989fa7f4f1
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 08 20:14:42 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:42.616 105024 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1', 'env', 'PROCESS_TAG=haproxy-d362f29d-3769-41ec-9071-c5989fa7f4f1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d362f29d-3769-41ec-9071-c5989fa7f4f1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.650 187791 DEBUG nova.network.neutron [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Updated VIF entry in instance network info cache for port 15884d80-a050-45e4-a91f-aa0953fde76b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.650 187791 DEBUG nova.network.neutron [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Updating instance_info_cache with network_info: [{"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.685 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-1e4936a4-4e9a-45e3-9bdb-bd423abc6045" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.685 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.685 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.686 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.686 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.686 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Processing event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.686 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.686 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.687 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.687 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.687 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] No waiting events found dispatching network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.687 187791 WARNING nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received unexpected event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 for instance with vm_state building and task_state spawning.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.687 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received event network-vif-plugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.687 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.688 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.688 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.688 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Processing event network-vif-plugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.688 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received event network-vif-plugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.688 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.688 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.689 187791 DEBUG oslo_concurrency.lockutils [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.689 187791 DEBUG nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] No waiting events found dispatching network-vif-plugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.689 187791 WARNING nova.compute.manager [req-26990b3c-2139-4f5f-b9b7-dbf5344ab9c3 req-b154cd00-0f9d-42b8-b695-f9e4c4cd0c21 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received unexpected event network-vif-plugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad for instance with vm_state building and task_state spawning.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.690 187791 DEBUG nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.690 187791 DEBUG nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Instance event wait completed in 6 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.694 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224882.6943052, 31b38d28-b90e-434c-9967-912987aee08b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.694 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] VM Resumed (Lifecycle Event)
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.696 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.697 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.701 187791 INFO nova.virt.libvirt.driver [-] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Instance spawned successfully.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.701 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.716 187791 INFO nova.virt.libvirt.driver [-] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Instance spawned successfully.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.717 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.725 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.728 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.742 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.742 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.743 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.744 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.745 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.745 187791 DEBUG nova.virt.libvirt.driver [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.754 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.754 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.754 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.755 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.755 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.755 187791 DEBUG nova.virt.libvirt.driver [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.758 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.758 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224882.6960285, 9deed673-fc96-4e81-b9ba-c3f0e83e1625 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.759 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] VM Resumed (Lifecycle Event)
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.841 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.845 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.872 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.877 187791 INFO nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Took 19.10 seconds to spawn the instance on the hypervisor.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.878 187791 DEBUG nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.882 187791 INFO nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Took 17.92 seconds to spawn the instance on the hypervisor.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.882 187791 DEBUG nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.980 187791 INFO nova.compute.manager [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Took 19.82 seconds to build instance.
Dec 08 20:14:42 compute-0 nova_compute[187787]: 2025-12-08 20:14:42.983 187791 INFO nova.compute.manager [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Took 18.61 seconds to build instance.
Dec 08 20:14:43 compute-0 nova_compute[187787]: 2025-12-08 20:14:43.012 187791 DEBUG oslo_concurrency.lockutils [None req-861c2997-0e42-4845-b5c1-a4abe99f4524 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:43 compute-0 nova_compute[187787]: 2025-12-08 20:14:43.014 187791 DEBUG oslo_concurrency.lockutils [None req-d473d361-1a8e-4cef-bb6c-391871bc6a91 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:43 compute-0 podman[215036]: 2025-12-08 20:14:43.043076453 +0000 UTC m=+0.072594746 container create 25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:14:43 compute-0 systemd[1]: Started libpod-conmon-25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f.scope.
Dec 08 20:14:43 compute-0 podman[215036]: 2025-12-08 20:14:43.003954548 +0000 UTC m=+0.033472871 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:14:43 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:14:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5cf10a495bb0ccf5ed5c37b5a29ebc28005b8f86086e19fe148d4f9c7c4f1355/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 08 20:14:43 compute-0 podman[215036]: 2025-12-08 20:14:43.128078602 +0000 UTC m=+0.157596905 container init 25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 08 20:14:43 compute-0 podman[215036]: 2025-12-08 20:14:43.134285345 +0000 UTC m=+0.163803638 container start 25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 08 20:14:43 compute-0 neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1[215051]: [NOTICE]   (215055) : New worker (215057) forked
Dec 08 20:14:43 compute-0 neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1[215051]: [NOTICE]   (215055) : Loading success.
Dec 08 20:14:43 compute-0 nova_compute[187787]: 2025-12-08 20:14:43.341 187791 DEBUG nova.compute.manager [req-af7b1b3a-5a03-420d-8c57-5f3aa73f5243 req-3b720552-9e35-429b-8114-2bbaecdb546a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received event network-vif-plugged-15884d80-a050-45e4-a91f-aa0953fde76b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:43 compute-0 nova_compute[187787]: 2025-12-08 20:14:43.342 187791 DEBUG oslo_concurrency.lockutils [req-af7b1b3a-5a03-420d-8c57-5f3aa73f5243 req-3b720552-9e35-429b-8114-2bbaecdb546a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:43 compute-0 nova_compute[187787]: 2025-12-08 20:14:43.342 187791 DEBUG oslo_concurrency.lockutils [req-af7b1b3a-5a03-420d-8c57-5f3aa73f5243 req-3b720552-9e35-429b-8114-2bbaecdb546a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:43 compute-0 nova_compute[187787]: 2025-12-08 20:14:43.343 187791 DEBUG oslo_concurrency.lockutils [req-af7b1b3a-5a03-420d-8c57-5f3aa73f5243 req-3b720552-9e35-429b-8114-2bbaecdb546a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:43 compute-0 nova_compute[187787]: 2025-12-08 20:14:43.343 187791 DEBUG nova.compute.manager [req-af7b1b3a-5a03-420d-8c57-5f3aa73f5243 req-3b720552-9e35-429b-8114-2bbaecdb546a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] No waiting events found dispatching network-vif-plugged-15884d80-a050-45e4-a91f-aa0953fde76b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:14:43 compute-0 nova_compute[187787]: 2025-12-08 20:14:43.343 187791 WARNING nova.compute.manager [req-af7b1b3a-5a03-420d-8c57-5f3aa73f5243 req-3b720552-9e35-429b-8114-2bbaecdb546a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received unexpected event network-vif-plugged-15884d80-a050-45e4-a91f-aa0953fde76b for instance with vm_state active and task_state None.
Dec 08 20:14:44 compute-0 podman[215067]: 2025-12-08 20:14:44.753462312 +0000 UTC m=+0.070093188 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 08 20:14:44 compute-0 podman[215066]: 2025-12-08 20:14:44.789613955 +0000 UTC m=+0.113336261 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 08 20:14:45 compute-0 nova_compute[187787]: 2025-12-08 20:14:45.570 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:47 compute-0 nova_compute[187787]: 2025-12-08 20:14:47.443 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:48 compute-0 podman[215116]: 2025-12-08 20:14:47.999920038 +0000 UTC m=+0.065260897 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <info>  [1765224888.6144] manager: (patch-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/35)
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <info>  [1765224888.6151] device (patch-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <warn>  [1765224888.6153] device (patch-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <info>  [1765224888.6165] manager: (patch-br-int-to-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/36)
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <info>  [1765224888.6170] device (patch-br-int-to-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <warn>  [1765224888.6170] device (patch-br-int-to-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <info>  [1765224888.6181] manager: (patch-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <info>  [1765224888.6190] manager: (patch-br-int-to-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <info>  [1765224888.6197] device (patch-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 08 20:14:48 compute-0 NetworkManager[56229]: <info>  [1765224888.6202] device (patch-br-int-to-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 08 20:14:48 compute-0 nova_compute[187787]: 2025-12-08 20:14:48.622 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:48 compute-0 nova_compute[187787]: 2025-12-08 20:14:48.693 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:48 compute-0 ovn_controller[96170]: 2025-12-08T20:14:48Z|00047|binding|INFO|Releasing lport 0c8652ec-61a0-4d40-8e5f-b6b7db09dd2b from this chassis (sb_readonly=0)
Dec 08 20:14:48 compute-0 ovn_controller[96170]: 2025-12-08T20:14:48Z|00048|binding|INFO|Releasing lport 8810d692-3563-4549-98b0-f9429e158843 from this chassis (sb_readonly=0)
Dec 08 20:14:48 compute-0 ovn_controller[96170]: 2025-12-08T20:14:48Z|00049|binding|INFO|Releasing lport f0094d9b-0146-4554-97c5-906c6477650e from this chassis (sb_readonly=0)
Dec 08 20:14:48 compute-0 ovn_controller[96170]: 2025-12-08T20:14:48Z|00050|binding|INFO|Releasing lport e1bfa09e-e9ff-44b2-80e4-f07b8a208bf6 from this chassis (sb_readonly=0)
Dec 08 20:14:48 compute-0 nova_compute[187787]: 2025-12-08 20:14:48.718 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:49 compute-0 sshd-session[215142]: Invalid user oo from 47.76.127.165 port 56942
Dec 08 20:14:50 compute-0 sshd-session[215142]: Received disconnect from 47.76.127.165 port 56942:11: Bye Bye [preauth]
Dec 08 20:14:50 compute-0 sshd-session[215142]: Disconnected from invalid user oo 47.76.127.165 port 56942 [preauth]
Dec 08 20:14:50 compute-0 nova_compute[187787]: 2025-12-08 20:14:50.572 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.320 187791 DEBUG nova.compute.manager [req-7e49649f-5984-40cc-8d0a-3eefaac4606b req-1a0d1a09-2343-4340-bc26-902772650db5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received event network-changed-15884d80-a050-45e4-a91f-aa0953fde76b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.321 187791 DEBUG nova.compute.manager [req-7e49649f-5984-40cc-8d0a-3eefaac4606b req-1a0d1a09-2343-4340-bc26-902772650db5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Refreshing instance network info cache due to event network-changed-15884d80-a050-45e4-a91f-aa0953fde76b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.321 187791 DEBUG oslo_concurrency.lockutils [req-7e49649f-5984-40cc-8d0a-3eefaac4606b req-1a0d1a09-2343-4340-bc26-902772650db5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-1e4936a4-4e9a-45e3-9bdb-bd423abc6045" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.321 187791 DEBUG oslo_concurrency.lockutils [req-7e49649f-5984-40cc-8d0a-3eefaac4606b req-1a0d1a09-2343-4340-bc26-902772650db5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-1e4936a4-4e9a-45e3-9bdb-bd423abc6045" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.321 187791 DEBUG nova.network.neutron [req-7e49649f-5984-40cc-8d0a-3eefaac4606b req-1a0d1a09-2343-4340-bc26-902772650db5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Refreshing network info cache for port 15884d80-a050-45e4-a91f-aa0953fde76b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.476 187791 DEBUG nova.compute.manager [req-bf7a326c-9386-4578-9a7a-6eba1e0d6117 req-662de28b-efef-4b92-8ca1-ce2e2fe0c586 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-changed-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.477 187791 DEBUG nova.compute.manager [req-bf7a326c-9386-4578-9a7a-6eba1e0d6117 req-662de28b-efef-4b92-8ca1-ce2e2fe0c586 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Refreshing instance network info cache due to event network-changed-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.477 187791 DEBUG oslo_concurrency.lockutils [req-bf7a326c-9386-4578-9a7a-6eba1e0d6117 req-662de28b-efef-4b92-8ca1-ce2e2fe0c586 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.478 187791 DEBUG oslo_concurrency.lockutils [req-bf7a326c-9386-4578-9a7a-6eba1e0d6117 req-662de28b-efef-4b92-8ca1-ce2e2fe0c586 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.478 187791 DEBUG nova.network.neutron [req-bf7a326c-9386-4578-9a7a-6eba1e0d6117 req-662de28b-efef-4b92-8ca1-ce2e2fe0c586 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Refreshing network info cache for port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:14:51 compute-0 nova_compute[187787]: 2025-12-08 20:14:51.987 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:51 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:51.988 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ea:67:f9', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1e:d7:e5:ba:bd:f4'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:14:51 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:51.990 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.070 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:52.071 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ea:67:f9', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1e:d7:e5:ba:bd:f4'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:14:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:52.073 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 08 20:14:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:52.075 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.445 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.549 187791 DEBUG oslo_concurrency.lockutils [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.550 187791 DEBUG oslo_concurrency.lockutils [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.550 187791 DEBUG oslo_concurrency.lockutils [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.550 187791 DEBUG oslo_concurrency.lockutils [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.551 187791 DEBUG oslo_concurrency.lockutils [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.552 187791 INFO nova.compute.manager [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Terminating instance
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.553 187791 DEBUG nova.compute.manager [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 08 20:14:52 compute-0 kernel: tap15884d80-a0 (unregistering): left promiscuous mode
Dec 08 20:14:52 compute-0 NetworkManager[56229]: <info>  [1765224892.5891] device (tap15884d80-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 08 20:14:52 compute-0 ovn_controller[96170]: 2025-12-08T20:14:52Z|00051|binding|INFO|Releasing lport 15884d80-a050-45e4-a91f-aa0953fde76b from this chassis (sb_readonly=0)
Dec 08 20:14:52 compute-0 ovn_controller[96170]: 2025-12-08T20:14:52Z|00052|binding|INFO|Setting lport 15884d80-a050-45e4-a91f-aa0953fde76b down in Southbound
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.597 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:52 compute-0 ovn_controller[96170]: 2025-12-08T20:14:52Z|00053|binding|INFO|Removing iface tap15884d80-a0 ovn-installed in OVS
Dec 08 20:14:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:52.615 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:2d:37 10.100.0.5'], port_security=['fa:16:3e:d7:2d:37 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1e4936a4-4e9a-45e3-9bdb-bd423abc6045', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d362f29d-3769-41ec-9071-c5989fa7f4f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c09b5e1948374012b56a6b174f13203a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8587a849-6ae8-41e1-b943-a03fa78a52c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.230'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41fe0558-284a-4ff7-a89b-c230752e9d97, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=15884d80-a050-45e4-a91f-aa0953fde76b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:14:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:52.617 105024 INFO neutron.agent.ovn.metadata.agent [-] Port 15884d80-a050-45e4-a91f-aa0953fde76b in datapath d362f29d-3769-41ec-9071-c5989fa7f4f1 unbound from our chassis
Dec 08 20:14:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:52.620 105024 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d362f29d-3769-41ec-9071-c5989fa7f4f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.623 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:52.623 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[4cdc71cf-2879-4e06-8108-e3ad6038deee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:52.624 105024 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1 namespace which is not needed anymore
Dec 08 20:14:52 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Dec 08 20:14:52 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 11.720s CPU time.
Dec 08 20:14:52 compute-0 systemd-machined[154122]: Machine qemu-4-instance-00000004 terminated.
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.808 187791 INFO nova.virt.libvirt.driver [-] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Instance destroyed successfully.
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.808 187791 DEBUG nova.objects.instance [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lazy-loading 'resources' on Instance uuid 1e4936a4-4e9a-45e3-9bdb-bd423abc6045 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.823 187791 DEBUG nova.virt.libvirt.vif [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:14:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestManualDisk-server-1919705409',display_name='tempest-ServersTestManualDisk-server-1919705409',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestmanualdisk-server-1919705409',id=4,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMy7CWmi+A6syNCS8/WYtOKv7Vux4U24DOqxdQ+/7mPhbR66uGjy0QZAvUEJ1ZAAj6PhlIZqclMmdcMZZSllbJ5nImidjRemuKe7PQUnZX4pp7rsSfIX0hlGi1ppXkAlPg==',key_name='tempest-keypair-786640513',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:14:40Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c09b5e1948374012b56a6b174f13203a',ramdisk_id='',reservation_id='r-inpxfaxn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestManualDisk-1603475250',owner_user_name='tempest-ServersTestManualDisk-1603475250-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:14:41Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='182a1bf5731c443d90e215465b085637',uuid=1e4936a4-4e9a-45e3-9bdb-bd423abc6045,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.823 187791 DEBUG nova.network.os_vif_util [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Converting VIF {"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.824 187791 DEBUG nova.network.os_vif_util [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:2d:37,bridge_name='br-int',has_traffic_filtering=True,id=15884d80-a050-45e4-a91f-aa0953fde76b,network=Network(d362f29d-3769-41ec-9071-c5989fa7f4f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15884d80-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.824 187791 DEBUG os_vif [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:2d:37,bridge_name='br-int',has_traffic_filtering=True,id=15884d80-a050-45e4-a91f-aa0953fde76b,network=Network(d362f29d-3769-41ec-9071-c5989fa7f4f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15884d80-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.826 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.826 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15884d80-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.827 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.828 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.830 187791 INFO os_vif [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:2d:37,bridge_name='br-int',has_traffic_filtering=True,id=15884d80-a050-45e4-a91f-aa0953fde76b,network=Network(d362f29d-3769-41ec-9071-c5989fa7f4f1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap15884d80-a0')
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.831 187791 INFO nova.virt.libvirt.driver [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Deleting instance files /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045_del
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.831 187791 INFO nova.virt.libvirt.driver [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Deletion of /var/lib/nova/instances/1e4936a4-4e9a-45e3-9bdb-bd423abc6045_del complete
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.913 187791 DEBUG nova.virt.libvirt.host [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.914 187791 INFO nova.virt.libvirt.host [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] UEFI support detected
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.915 187791 INFO nova.compute.manager [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.916 187791 DEBUG oslo.service.loopingcall [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.916 187791 DEBUG nova.compute.manager [-] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 08 20:14:52 compute-0 nova_compute[187787]: 2025-12-08 20:14:52.916 187791 DEBUG nova.network.neutron [-] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 08 20:14:52 compute-0 neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1[215051]: [NOTICE]   (215055) : haproxy version is 2.8.14-c23fe91
Dec 08 20:14:52 compute-0 neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1[215051]: [NOTICE]   (215055) : path to executable is /usr/sbin/haproxy
Dec 08 20:14:52 compute-0 neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1[215051]: [WARNING]  (215055) : Exiting Master process...
Dec 08 20:14:52 compute-0 neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1[215051]: [WARNING]  (215055) : Exiting Master process...
Dec 08 20:14:52 compute-0 neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1[215051]: [ALERT]    (215055) : Current worker (215057) exited with code 143 (Terminated)
Dec 08 20:14:52 compute-0 neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1[215051]: [WARNING]  (215055) : All workers exited. Exiting... (0)
Dec 08 20:14:52 compute-0 systemd[1]: libpod-25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f.scope: Deactivated successfully.
Dec 08 20:14:52 compute-0 podman[215174]: 2025-12-08 20:14:52.930290016 +0000 UTC m=+0.180640955 container died 25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:14:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f-userdata-shm.mount: Deactivated successfully.
Dec 08 20:14:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-5cf10a495bb0ccf5ed5c37b5a29ebc28005b8f86086e19fe148d4f9c7c4f1355-merged.mount: Deactivated successfully.
Dec 08 20:14:52 compute-0 podman[215174]: 2025-12-08 20:14:52.981273509 +0000 UTC m=+0.231624458 container cleanup 25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:14:52 compute-0 systemd[1]: libpod-conmon-25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f.scope: Deactivated successfully.
Dec 08 20:14:53 compute-0 podman[215219]: 2025-12-08 20:14:53.077428094 +0000 UTC m=+0.074536351 container remove 25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Dec 08 20:14:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:53.091 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[d97b7082-103a-41f1-b24e-6f8220d73ee4]: (4, ('Mon Dec  8 08:14:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1 (25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f)\n25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f\nMon Dec  8 08:14:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1 (25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f)\n25820b0cc71039ff6df9d6e9e9021f4c64f910c8d922b9780617a62dba4a4a1f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:53.092 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[e0fcebd8-ebc0-4fe3-9972-f73a11c16560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:53.094 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd362f29d-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.095 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:53 compute-0 kernel: tapd362f29d-30: left promiscuous mode
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.097 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:53.103 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f684ae00-ac15-4517-b24c-d07f68c7530a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.113 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:53.127 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[eb3b8062-7bf5-426a-9738-c920c43124f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:53.128 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6ffd2a-dff9-464c-a80b-4a3012066875]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:53.145 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[07df6d47-c70e-4cdd-a371-6d235a394866]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341992, 'reachable_time': 37317, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215232, 'error': None, 'target': 'ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:53 compute-0 systemd[1]: run-netns-ovnmeta\x2dd362f29d\x2d3769\x2d41ec\x2d9071\x2dc5989fa7f4f1.mount: Deactivated successfully.
Dec 08 20:14:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:53.155 105136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d362f29d-3769-41ec-9071-c5989fa7f4f1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 08 20:14:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:53.155 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[2003c6c2-c7f3-4caf-8b3f-ba4d1726cee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.287 187791 DEBUG nova.network.neutron [req-7e49649f-5984-40cc-8d0a-3eefaac4606b req-1a0d1a09-2343-4340-bc26-902772650db5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Updated VIF entry in instance network info cache for port 15884d80-a050-45e4-a91f-aa0953fde76b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.287 187791 DEBUG nova.network.neutron [req-7e49649f-5984-40cc-8d0a-3eefaac4606b req-1a0d1a09-2343-4340-bc26-902772650db5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Updating instance_info_cache with network_info: [{"id": "15884d80-a050-45e4-a91f-aa0953fde76b", "address": "fa:16:3e:d7:2d:37", "network": {"id": "d362f29d-3769-41ec-9071-c5989fa7f4f1", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-1366012101-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.230", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c09b5e1948374012b56a6b174f13203a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap15884d80-a0", "ovs_interfaceid": "15884d80-a050-45e4-a91f-aa0953fde76b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.326 187791 DEBUG oslo_concurrency.lockutils [req-7e49649f-5984-40cc-8d0a-3eefaac4606b req-1a0d1a09-2343-4340-bc26-902772650db5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-1e4936a4-4e9a-45e3-9bdb-bd423abc6045" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.669 187791 DEBUG nova.compute.manager [req-74e305a9-a15a-4732-8eaa-4bc54120d784 req-f6034d4e-db2b-41d2-8eb0-5fae7510d110 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received event network-changed-488a5725-c797-4165-b8ce-319c48f2e8b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.670 187791 DEBUG nova.compute.manager [req-74e305a9-a15a-4732-8eaa-4bc54120d784 req-f6034d4e-db2b-41d2-8eb0-5fae7510d110 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Refreshing instance network info cache due to event network-changed-488a5725-c797-4165-b8ce-319c48f2e8b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.670 187791 DEBUG oslo_concurrency.lockutils [req-74e305a9-a15a-4732-8eaa-4bc54120d784 req-f6034d4e-db2b-41d2-8eb0-5fae7510d110 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.670 187791 DEBUG oslo_concurrency.lockutils [req-74e305a9-a15a-4732-8eaa-4bc54120d784 req-f6034d4e-db2b-41d2-8eb0-5fae7510d110 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.670 187791 DEBUG nova.network.neutron [req-74e305a9-a15a-4732-8eaa-4bc54120d784 req-f6034d4e-db2b-41d2-8eb0-5fae7510d110 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Refreshing network info cache for port 488a5725-c797-4165-b8ce-319c48f2e8b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.974 187791 DEBUG nova.compute.manager [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received event network-changed-f25bb8cc-9d43-433e-9e69-63da3f5c18ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.974 187791 DEBUG nova.compute.manager [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Refreshing instance network info cache due to event network-changed-f25bb8cc-9d43-433e-9e69-63da3f5c18ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.975 187791 DEBUG oslo_concurrency.lockutils [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-9deed673-fc96-4e81-b9ba-c3f0e83e1625" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.975 187791 DEBUG oslo_concurrency.lockutils [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-9deed673-fc96-4e81-b9ba-c3f0e83e1625" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:14:53 compute-0 nova_compute[187787]: 2025-12-08 20:14:53.975 187791 DEBUG nova.network.neutron [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Refreshing network info cache for port f25bb8cc-9d43-433e-9e69-63da3f5c18ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.344 187791 DEBUG oslo_concurrency.lockutils [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.344 187791 DEBUG oslo_concurrency.lockutils [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.344 187791 DEBUG oslo_concurrency.lockutils [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.345 187791 DEBUG oslo_concurrency.lockutils [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.345 187791 DEBUG oslo_concurrency.lockutils [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.346 187791 INFO nova.compute.manager [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Terminating instance
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.347 187791 DEBUG nova.compute.manager [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 08 20:14:54 compute-0 kernel: tapf25bb8cc-9d (unregistering): left promiscuous mode
Dec 08 20:14:54 compute-0 NetworkManager[56229]: <info>  [1765224894.4158] device (tapf25bb8cc-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.420 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:54 compute-0 ovn_controller[96170]: 2025-12-08T20:14:54Z|00054|binding|INFO|Releasing lport f25bb8cc-9d43-433e-9e69-63da3f5c18ad from this chassis (sb_readonly=0)
Dec 08 20:14:54 compute-0 ovn_controller[96170]: 2025-12-08T20:14:54Z|00055|binding|INFO|Setting lport f25bb8cc-9d43-433e-9e69-63da3f5c18ad down in Southbound
Dec 08 20:14:54 compute-0 ovn_controller[96170]: 2025-12-08T20:14:54Z|00056|binding|INFO|Removing iface tapf25bb8cc-9d ovn-installed in OVS
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.439 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.442 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.446 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:16:36 10.100.0.9'], port_security=['fa:16:3e:02:16:36 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '9deed673-fc96-4e81-b9ba-c3f0e83e1625', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b5e5772-d3b1-48b1-8423-229c5601e5b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '71532308007a48d5aef697fbd39501f6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '15728c5a-0648-40f0-b261-fafccf1978da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.179'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27a9d363-1573-4efe-9e17-2aefd8a246e5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=f25bb8cc-9d43-433e-9e69-63da3f5c18ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.447 105024 INFO neutron.agent.ovn.metadata.agent [-] Port f25bb8cc-9d43-433e-9e69-63da3f5c18ad in datapath 9b5e5772-d3b1-48b1-8423-229c5601e5b8 unbound from our chassis
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.449 105024 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b5e5772-d3b1-48b1-8423-229c5601e5b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.450 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c11594-029b-4712-bc09-476f653e8004]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.451 105024 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8 namespace which is not needed anymore
Dec 08 20:14:54 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 08 20:14:54 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 11.909s CPU time.
Dec 08 20:14:54 compute-0 systemd-machined[154122]: Machine qemu-2-instance-00000002 terminated.
Dec 08 20:14:54 compute-0 neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8[214886]: [NOTICE]   (214890) : haproxy version is 2.8.14-c23fe91
Dec 08 20:14:54 compute-0 neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8[214886]: [NOTICE]   (214890) : path to executable is /usr/sbin/haproxy
Dec 08 20:14:54 compute-0 neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8[214886]: [WARNING]  (214890) : Exiting Master process...
Dec 08 20:14:54 compute-0 neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8[214886]: [ALERT]    (214890) : Current worker (214892) exited with code 143 (Terminated)
Dec 08 20:14:54 compute-0 neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8[214886]: [WARNING]  (214890) : All workers exited. Exiting... (0)
Dec 08 20:14:54 compute-0 systemd[1]: libpod-8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f.scope: Deactivated successfully.
Dec 08 20:14:54 compute-0 conmon[214886]: conmon 8f49a5ae042487c75b56 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f.scope/container/memory.events
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.603 187791 INFO nova.virt.libvirt.driver [-] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Instance destroyed successfully.
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.603 187791 DEBUG nova.objects.instance [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lazy-loading 'resources' on Instance uuid 9deed673-fc96-4e81-b9ba-c3f0e83e1625 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:14:54 compute-0 podman[215297]: 2025-12-08 20:14:54.606131068 +0000 UTC m=+0.052984286 container died 8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.626 187791 DEBUG nova.virt.libvirt.vif [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=::babe:dc0c:1602,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:14:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestJSON-server-853585639',display_name='tempest-ServersTestJSON-server-853585639',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serverstestjson-server-853585639',id=2,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCaUFVSmdjdFM6Nr7JK2ENZ5QSAJNbmdFQI+NWSK4PwXWFNq/zuvL0KeEeiTQrSzUIFjT/xX/I392m1qXIMa6vFo4a9mq+fgSkfVNn+Pzisv0GZRCLUidzu4kPnQkU1Zyg==',key_name='tempest-keypair-321994246',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:14:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={hello='world'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='71532308007a48d5aef697fbd39501f6',ramdisk_id='',reservation_id='r-s3d9t0l6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestJSON-27859848',owner_user_name='tempest-ServersTestJSON-27859848-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:14:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='5c734374332f4a18956eedb746b128bf',uuid=9deed673-fc96-4e81-b9ba-c3f0e83e1625,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.627 187791 DEBUG nova.network.os_vif_util [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Converting VIF {"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.628 187791 DEBUG nova.network.os_vif_util [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:36,bridge_name='br-int',has_traffic_filtering=True,id=f25bb8cc-9d43-433e-9e69-63da3f5c18ad,network=Network(9b5e5772-d3b1-48b1-8423-229c5601e5b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25bb8cc-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.629 187791 DEBUG os_vif [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:36,bridge_name='br-int',has_traffic_filtering=True,id=f25bb8cc-9d43-433e-9e69-63da3f5c18ad,network=Network(9b5e5772-d3b1-48b1-8423-229c5601e5b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25bb8cc-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.630 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.631 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf25bb8cc-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:54 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f-userdata-shm.mount: Deactivated successfully.
Dec 08 20:14:54 compute-0 systemd[1]: var-lib-containers-storage-overlay-e563203fdf2b04dd4a2dcdcd5ea92cf38d629511e45263b741cf075cfddba2ae-merged.mount: Deactivated successfully.
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.667 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.670 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.673 187791 INFO os_vif [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:36,bridge_name='br-int',has_traffic_filtering=True,id=f25bb8cc-9d43-433e-9e69-63da3f5c18ad,network=Network(9b5e5772-d3b1-48b1-8423-229c5601e5b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf25bb8cc-9d')
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.674 187791 INFO nova.virt.libvirt.driver [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Deleting instance files /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625_del
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.675 187791 INFO nova.virt.libvirt.driver [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Deletion of /var/lib/nova/instances/9deed673-fc96-4e81-b9ba-c3f0e83e1625_del complete
Dec 08 20:14:54 compute-0 podman[215297]: 2025-12-08 20:14:54.677250241 +0000 UTC m=+0.124103459 container cleanup 8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 08 20:14:54 compute-0 systemd[1]: libpod-conmon-8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f.scope: Deactivated successfully.
Dec 08 20:14:54 compute-0 podman[215341]: 2025-12-08 20:14:54.753869725 +0000 UTC m=+0.047041680 container remove 8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.758 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[6fbc9bbc-2b9a-4474-9e8c-8987dc073ab2]: (4, ('Mon Dec  8 08:14:54 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8 (8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f)\n8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f\nMon Dec  8 08:14:54 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8 (8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f)\n8f49a5ae042487c75b5682f8199653769927c3e5853f7c0b9d660be3192dc57f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.759 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[bf9ec66d-0cfd-414e-a53a-18acae9b2ad6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.760 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9b5e5772-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.762 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:54 compute-0 kernel: tap9b5e5772-d0: left promiscuous mode
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.764 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.766 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[2fa1e958-21c8-43d5-9fc5-66be460d3705]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.778 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.786 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[5f228945-e854-4b93-987e-78df15ab29a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.787 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[36095487-13a2-419a-8d77-e9929d12a19a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.802 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[ffdb3f19-2aa7-44ad-9d5c-a5b92515b9d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341822, 'reachable_time': 42144, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215356, 'error': None, 'target': 'ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.804 105136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9b5e5772-d3b1-48b1-8423-229c5601e5b8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.804 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[a9bd6c80-8951-49d0-b2dc-77211acdbf23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:14:54 compute-0 systemd[1]: run-netns-ovnmeta\x2d9b5e5772\x2dd3b1\x2d48b1\x2d8423\x2d229c5601e5b8.mount: Deactivated successfully.
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.930 187791 INFO nova.compute.manager [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Took 0.58 seconds to destroy the instance on the hypervisor.
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.931 187791 DEBUG oslo.service.loopingcall [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.931 187791 DEBUG nova.compute.manager [-] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 08 20:14:54 compute-0 nova_compute[187787]: 2025-12-08 20:14:54.932 187791 DEBUG nova.network.neutron [-] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.988 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.989 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:14:54.989 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:55 compute-0 ovn_controller[96170]: 2025-12-08T20:14:55Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d3:0b:70 10.100.0.10
Dec 08 20:14:55 compute-0 ovn_controller[96170]: 2025-12-08T20:14:55Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d3:0b:70 10.100.0.10
Dec 08 20:14:55 compute-0 nova_compute[187787]: 2025-12-08 20:14:55.574 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:55 compute-0 ovn_controller[96170]: 2025-12-08T20:14:55Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6c:cd:5c 10.100.0.4
Dec 08 20:14:55 compute-0 ovn_controller[96170]: 2025-12-08T20:14:55Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:cd:5c 10.100.0.4
Dec 08 20:14:55 compute-0 nova_compute[187787]: 2025-12-08 20:14:55.908 187791 DEBUG nova.network.neutron [-] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:55 compute-0 nova_compute[187787]: 2025-12-08 20:14:55.949 187791 INFO nova.compute.manager [-] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Took 3.03 seconds to deallocate network for instance.
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.015 187791 DEBUG nova.compute.manager [req-3387c170-def4-42fe-b206-e9e89555f73d req-c38de340-fda1-4952-b665-58f048e84f57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received event network-vif-unplugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.016 187791 DEBUG oslo_concurrency.lockutils [req-3387c170-def4-42fe-b206-e9e89555f73d req-c38de340-fda1-4952-b665-58f048e84f57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.016 187791 DEBUG oslo_concurrency.lockutils [req-3387c170-def4-42fe-b206-e9e89555f73d req-c38de340-fda1-4952-b665-58f048e84f57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.017 187791 DEBUG oslo_concurrency.lockutils [req-3387c170-def4-42fe-b206-e9e89555f73d req-c38de340-fda1-4952-b665-58f048e84f57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.017 187791 DEBUG nova.compute.manager [req-3387c170-def4-42fe-b206-e9e89555f73d req-c38de340-fda1-4952-b665-58f048e84f57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] No waiting events found dispatching network-vif-unplugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.017 187791 DEBUG nova.compute.manager [req-3387c170-def4-42fe-b206-e9e89555f73d req-c38de340-fda1-4952-b665-58f048e84f57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received event network-vif-unplugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.019 187791 DEBUG oslo_concurrency.lockutils [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.020 187791 DEBUG oslo_concurrency.lockutils [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.028 187791 DEBUG nova.network.neutron [req-bf7a326c-9386-4578-9a7a-6eba1e0d6117 req-662de28b-efef-4b92-8ca1-ce2e2fe0c586 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updated VIF entry in instance network info cache for port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.029 187791 DEBUG nova.network.neutron [req-bf7a326c-9386-4578-9a7a-6eba1e0d6117 req-662de28b-efef-4b92-8ca1-ce2e2fe0c586 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updating instance_info_cache with network_info: [{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.059 187791 DEBUG oslo_concurrency.lockutils [req-bf7a326c-9386-4578-9a7a-6eba1e0d6117 req-662de28b-efef-4b92-8ca1-ce2e2fe0c586 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.082 187791 DEBUG nova.compute.manager [req-2bfa4a4c-bece-4bcb-9094-7ca322582952 req-bb5a800c-ad56-4efd-be21-be01e3afe505 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received event network-vif-plugged-15884d80-a050-45e4-a91f-aa0953fde76b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.083 187791 DEBUG oslo_concurrency.lockutils [req-2bfa4a4c-bece-4bcb-9094-7ca322582952 req-bb5a800c-ad56-4efd-be21-be01e3afe505 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.084 187791 DEBUG oslo_concurrency.lockutils [req-2bfa4a4c-bece-4bcb-9094-7ca322582952 req-bb5a800c-ad56-4efd-be21-be01e3afe505 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.084 187791 DEBUG oslo_concurrency.lockutils [req-2bfa4a4c-bece-4bcb-9094-7ca322582952 req-bb5a800c-ad56-4efd-be21-be01e3afe505 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.084 187791 DEBUG nova.compute.manager [req-2bfa4a4c-bece-4bcb-9094-7ca322582952 req-bb5a800c-ad56-4efd-be21-be01e3afe505 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] No waiting events found dispatching network-vif-plugged-15884d80-a050-45e4-a91f-aa0953fde76b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.085 187791 WARNING nova.compute.manager [req-2bfa4a4c-bece-4bcb-9094-7ca322582952 req-bb5a800c-ad56-4efd-be21-be01e3afe505 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received unexpected event network-vif-plugged-15884d80-a050-45e4-a91f-aa0953fde76b for instance with vm_state deleted and task_state None.
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.085 187791 DEBUG nova.compute.manager [req-2bfa4a4c-bece-4bcb-9094-7ca322582952 req-bb5a800c-ad56-4efd-be21-be01e3afe505 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received event network-vif-deleted-15884d80-a050-45e4-a91f-aa0953fde76b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.252 187791 DEBUG nova.compute.provider_tree [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.279 187791 DEBUG nova.scheduler.client.report [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.308 187791 DEBUG oslo_concurrency.lockutils [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.288s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.350 187791 INFO nova.scheduler.client.report [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Deleted allocations for instance 1e4936a4-4e9a-45e3-9bdb-bd423abc6045
Dec 08 20:14:56 compute-0 nova_compute[187787]: 2025-12-08 20:14:56.415 187791 DEBUG oslo_concurrency.lockutils [None req-55c100bd-359b-4a05-a459-94b87e6aced1 182a1bf5731c443d90e215465b085637 c09b5e1948374012b56a6b174f13203a - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.865s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:57 compute-0 podman[215357]: 2025-12-08 20:14:57.492923389 +0000 UTC m=+0.059565212 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm)
Dec 08 20:14:57 compute-0 nova_compute[187787]: 2025-12-08 20:14:57.529 187791 DEBUG nova.network.neutron [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Updated VIF entry in instance network info cache for port f25bb8cc-9d43-433e-9e69-63da3f5c18ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:14:57 compute-0 nova_compute[187787]: 2025-12-08 20:14:57.530 187791 DEBUG nova.network.neutron [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Updating instance_info_cache with network_info: [{"id": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "address": "fa:16:3e:02:16:36", "network": {"id": "9b5e5772-d3b1-48b1-8423-229c5601e5b8", "bridge": "br-int", "label": "tempest-ServersTestJSON-667285295-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.179", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "71532308007a48d5aef697fbd39501f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf25bb8cc-9d", "ovs_interfaceid": "f25bb8cc-9d43-433e-9e69-63da3f5c18ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:57 compute-0 nova_compute[187787]: 2025-12-08 20:14:57.558 187791 DEBUG oslo_concurrency.lockutils [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-9deed673-fc96-4e81-b9ba-c3f0e83e1625" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:57 compute-0 nova_compute[187787]: 2025-12-08 20:14:57.559 187791 DEBUG nova.compute.manager [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received event network-vif-unplugged-15884d80-a050-45e4-a91f-aa0953fde76b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:57 compute-0 nova_compute[187787]: 2025-12-08 20:14:57.559 187791 DEBUG oslo_concurrency.lockutils [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:57 compute-0 nova_compute[187787]: 2025-12-08 20:14:57.559 187791 DEBUG oslo_concurrency.lockutils [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:57 compute-0 nova_compute[187787]: 2025-12-08 20:14:57.559 187791 DEBUG oslo_concurrency.lockutils [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "1e4936a4-4e9a-45e3-9bdb-bd423abc6045-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:57 compute-0 nova_compute[187787]: 2025-12-08 20:14:57.560 187791 DEBUG nova.compute.manager [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] No waiting events found dispatching network-vif-unplugged-15884d80-a050-45e4-a91f-aa0953fde76b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:14:57 compute-0 nova_compute[187787]: 2025-12-08 20:14:57.560 187791 DEBUG nova.compute.manager [req-8fc06619-903b-4b71-b647-e6c07e9b152d req-3336ae08-ed04-4338-813b-f9da30354fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Received event network-vif-unplugged-15884d80-a050-45e4-a91f-aa0953fde76b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.332 187791 DEBUG nova.network.neutron [req-74e305a9-a15a-4732-8eaa-4bc54120d784 req-f6034d4e-db2b-41d2-8eb0-5fae7510d110 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Updated VIF entry in instance network info cache for port 488a5725-c797-4165-b8ce-319c48f2e8b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.333 187791 DEBUG nova.network.neutron [req-74e305a9-a15a-4732-8eaa-4bc54120d784 req-f6034d4e-db2b-41d2-8eb0-5fae7510d110 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Updating instance_info_cache with network_info: [{"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.362 187791 DEBUG oslo_concurrency.lockutils [req-74e305a9-a15a-4732-8eaa-4bc54120d784 req-f6034d4e-db2b-41d2-8eb0-5fae7510d110 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.404 187791 DEBUG nova.compute.manager [req-f4777e5f-f3c4-4e68-8ccd-0e805d7f1107 req-13a693d8-fafe-4774-bc57-d3f02bd12a10 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received event network-vif-plugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.404 187791 DEBUG oslo_concurrency.lockutils [req-f4777e5f-f3c4-4e68-8ccd-0e805d7f1107 req-13a693d8-fafe-4774-bc57-d3f02bd12a10 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.405 187791 DEBUG oslo_concurrency.lockutils [req-f4777e5f-f3c4-4e68-8ccd-0e805d7f1107 req-13a693d8-fafe-4774-bc57-d3f02bd12a10 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.405 187791 DEBUG oslo_concurrency.lockutils [req-f4777e5f-f3c4-4e68-8ccd-0e805d7f1107 req-13a693d8-fafe-4774-bc57-d3f02bd12a10 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.407 187791 DEBUG nova.compute.manager [req-f4777e5f-f3c4-4e68-8ccd-0e805d7f1107 req-13a693d8-fafe-4774-bc57-d3f02bd12a10 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] No waiting events found dispatching network-vif-plugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.407 187791 WARNING nova.compute.manager [req-f4777e5f-f3c4-4e68-8ccd-0e805d7f1107 req-13a693d8-fafe-4774-bc57-d3f02bd12a10 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received unexpected event network-vif-plugged-f25bb8cc-9d43-433e-9e69-63da3f5c18ad for instance with vm_state active and task_state deleting.
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.556 187791 DEBUG nova.network.neutron [-] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.577 187791 INFO nova.compute.manager [-] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Took 3.65 seconds to deallocate network for instance.
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.621 187791 DEBUG oslo_concurrency.lockutils [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.622 187791 DEBUG oslo_concurrency.lockutils [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.740 187791 DEBUG nova.compute.provider_tree [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.792 187791 DEBUG nova.scheduler.client.report [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.815 187791 DEBUG oslo_concurrency.lockutils [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.863 187791 INFO nova.scheduler.client.report [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Deleted allocations for instance 9deed673-fc96-4e81-b9ba-c3f0e83e1625
Dec 08 20:14:58 compute-0 nova_compute[187787]: 2025-12-08 20:14:58.940 187791 DEBUG oslo_concurrency.lockutils [None req-d7425452-16cf-4c74-a295-e6d47902c02a 5c734374332f4a18956eedb746b128bf 71532308007a48d5aef697fbd39501f6 - - default default] Lock "9deed673-fc96-4e81-b9ba-c3f0e83e1625" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:14:59 compute-0 ovn_controller[96170]: 2025-12-08T20:14:59Z|00057|binding|INFO|Releasing lport 0c8652ec-61a0-4d40-8e5f-b6b7db09dd2b from this chassis (sb_readonly=0)
Dec 08 20:14:59 compute-0 ovn_controller[96170]: 2025-12-08T20:14:59Z|00058|binding|INFO|Releasing lport e1bfa09e-e9ff-44b2-80e4-f07b8a208bf6 from this chassis (sb_readonly=0)
Dec 08 20:14:59 compute-0 nova_compute[187787]: 2025-12-08 20:14:59.624 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:59 compute-0 nova_compute[187787]: 2025-12-08 20:14:59.664 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:14:59 compute-0 podman[202017]: time="2025-12-08T20:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:14:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 25009 "" "Go-http-client/1.1"
Dec 08 20:14:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 4383 "" "Go-http-client/1.1"
Dec 08 20:15:00 compute-0 nova_compute[187787]: 2025-12-08 20:15:00.579 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:00 compute-0 nova_compute[187787]: 2025-12-08 20:15:00.713 187791 DEBUG nova.compute.manager [req-01955974-e21b-4cda-aaf4-fea8894d0d79 req-e05e33be-573e-4d7c-b174-ccb2c938178c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Received event network-vif-deleted-f25bb8cc-9d43-433e-9e69-63da3f5c18ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:00 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:00.992 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:01 compute-0 openstack_network_exporter[204149]: ERROR   20:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:15:01 compute-0 openstack_network_exporter[204149]: ERROR   20:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:15:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:15:01 compute-0 openstack_network_exporter[204149]: ERROR   20:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:15:01 compute-0 openstack_network_exporter[204149]: ERROR   20:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:15:01 compute-0 openstack_network_exporter[204149]: ERROR   20:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:15:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:15:04 compute-0 ovn_controller[96170]: 2025-12-08T20:15:04Z|00059|binding|INFO|Releasing lport 0c8652ec-61a0-4d40-8e5f-b6b7db09dd2b from this chassis (sb_readonly=0)
Dec 08 20:15:04 compute-0 ovn_controller[96170]: 2025-12-08T20:15:04Z|00060|binding|INFO|Releasing lport e1bfa09e-e9ff-44b2-80e4-f07b8a208bf6 from this chassis (sb_readonly=0)
Dec 08 20:15:04 compute-0 nova_compute[187787]: 2025-12-08 20:15:04.259 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:04 compute-0 podman[215379]: 2025-12-08 20:15:04.491516566 +0000 UTC m=+0.055036791 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-type=git, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 08 20:15:04 compute-0 sshd-session[215377]: Received disconnect from 172.96.182.111 port 44936:11: Bye Bye [preauth]
Dec 08 20:15:04 compute-0 nova_compute[187787]: 2025-12-08 20:15:04.714 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:04 compute-0 sshd-session[215377]: Disconnected from authenticating user root 172.96.182.111 port 44936 [preauth]
Dec 08 20:15:05 compute-0 nova_compute[187787]: 2025-12-08 20:15:05.581 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:05 compute-0 ovn_controller[96170]: 2025-12-08T20:15:05Z|00061|binding|INFO|Releasing lport 0c8652ec-61a0-4d40-8e5f-b6b7db09dd2b from this chassis (sb_readonly=0)
Dec 08 20:15:05 compute-0 ovn_controller[96170]: 2025-12-08T20:15:05Z|00062|binding|INFO|Releasing lport e1bfa09e-e9ff-44b2-80e4-f07b8a208bf6 from this chassis (sb_readonly=0)
Dec 08 20:15:05 compute-0 nova_compute[187787]: 2025-12-08 20:15:05.716 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:06 compute-0 podman[215401]: 2025-12-08 20:15:06.483627352 +0000 UTC m=+0.050892741 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Dec 08 20:15:06 compute-0 podman[215400]: 2025-12-08 20:15:06.508901382 +0000 UTC m=+0.078657779 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:15:07 compute-0 nova_compute[187787]: 2025-12-08 20:15:07.807 187791 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765224892.80625, 1e4936a4-4e9a-45e3-9bdb-bd423abc6045 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:15:07 compute-0 nova_compute[187787]: 2025-12-08 20:15:07.808 187791 INFO nova.compute.manager [-] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] VM Stopped (Lifecycle Event)
Dec 08 20:15:07 compute-0 nova_compute[187787]: 2025-12-08 20:15:07.830 187791 DEBUG nova.compute.manager [None req-7bc662eb-99ba-4275-9bb7-03eecc952bae - - - - - -] [instance: 1e4936a4-4e9a-45e3-9bdb-bd423abc6045] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:15:09 compute-0 nova_compute[187787]: 2025-12-08 20:15:09.445 187791 DEBUG nova.objects.instance [None req-19da9113-e094-45fc-8687-6c21604f376d d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lazy-loading 'flavor' on Instance uuid 9f228c07-c6ac-479c-9edb-ceebc19eac87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:09 compute-0 nova_compute[187787]: 2025-12-08 20:15:09.514 187791 DEBUG oslo_concurrency.lockutils [None req-19da9113-e094-45fc-8687-6c21604f376d d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:15:09 compute-0 nova_compute[187787]: 2025-12-08 20:15:09.514 187791 DEBUG oslo_concurrency.lockutils [None req-19da9113-e094-45fc-8687-6c21604f376d d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquired lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:15:09 compute-0 nova_compute[187787]: 2025-12-08 20:15:09.601 187791 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765224894.6005275, 9deed673-fc96-4e81-b9ba-c3f0e83e1625 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:15:09 compute-0 nova_compute[187787]: 2025-12-08 20:15:09.602 187791 INFO nova.compute.manager [-] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] VM Stopped (Lifecycle Event)
Dec 08 20:15:09 compute-0 nova_compute[187787]: 2025-12-08 20:15:09.625 187791 DEBUG nova.compute.manager [None req-fa84eb61-89dc-4316-8372-4d5091b59591 - - - - - -] [instance: 9deed673-fc96-4e81-b9ba-c3f0e83e1625] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:15:09 compute-0 nova_compute[187787]: 2025-12-08 20:15:09.716 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:10 compute-0 nova_compute[187787]: 2025-12-08 20:15:10.583 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:11 compute-0 nova_compute[187787]: 2025-12-08 20:15:11.454 187791 DEBUG nova.network.neutron [None req-19da9113-e094-45fc-8687-6c21604f376d d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:15:11 compute-0 nova_compute[187787]: 2025-12-08 20:15:11.559 187791 DEBUG nova.compute.manager [req-6ebf4012-b56e-41ad-8025-e8244fbf7f26 req-38e110d7-73a8-404a-a90b-988f892b24d1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-changed-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:11 compute-0 nova_compute[187787]: 2025-12-08 20:15:11.559 187791 DEBUG nova.compute.manager [req-6ebf4012-b56e-41ad-8025-e8244fbf7f26 req-38e110d7-73a8-404a-a90b-988f892b24d1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Refreshing instance network info cache due to event network-changed-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:15:11 compute-0 nova_compute[187787]: 2025-12-08 20:15:11.560 187791 DEBUG oslo_concurrency.lockutils [req-6ebf4012-b56e-41ad-8025-e8244fbf7f26 req-38e110d7-73a8-404a-a90b-988f892b24d1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:15:12 compute-0 nova_compute[187787]: 2025-12-08 20:15:12.834 187791 DEBUG oslo_concurrency.lockutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:12 compute-0 nova_compute[187787]: 2025-12-08 20:15:12.834 187791 DEBUG oslo_concurrency.lockutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:12 compute-0 nova_compute[187787]: 2025-12-08 20:15:12.834 187791 INFO nova.compute.manager [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Rebooting instance
Dec 08 20:15:12 compute-0 nova_compute[187787]: 2025-12-08 20:15:12.858 187791 DEBUG oslo_concurrency.lockutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:15:12 compute-0 nova_compute[187787]: 2025-12-08 20:15:12.858 187791 DEBUG oslo_concurrency.lockutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquired lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:15:12 compute-0 nova_compute[187787]: 2025-12-08 20:15:12.859 187791 DEBUG nova.network.neutron [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:15:14 compute-0 sshd-session[215439]: Invalid user jenkins from 200.155.38.219 port 48200
Dec 08 20:15:14 compute-0 sshd-session[215439]: Received disconnect from 200.155.38.219 port 48200:11: Bye Bye [preauth]
Dec 08 20:15:14 compute-0 sshd-session[215439]: Disconnected from invalid user jenkins 200.155.38.219 port 48200 [preauth]
Dec 08 20:15:14 compute-0 nova_compute[187787]: 2025-12-08 20:15:14.719 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:15 compute-0 nova_compute[187787]: 2025-12-08 20:15:15.258 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:15 compute-0 podman[215442]: 2025-12-08 20:15:15.553690645 +0000 UTC m=+0.107016215 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 08 20:15:15 compute-0 podman[215441]: 2025-12-08 20:15:15.583367281 +0000 UTC m=+0.150554104 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202)
Dec 08 20:15:15 compute-0 nova_compute[187787]: 2025-12-08 20:15:15.585 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:16 compute-0 nova_compute[187787]: 2025-12-08 20:15:16.353 187791 DEBUG nova.network.neutron [None req-19da9113-e094-45fc-8687-6c21604f376d d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updating instance_info_cache with network_info: [{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:15:16 compute-0 nova_compute[187787]: 2025-12-08 20:15:16.378 187791 DEBUG oslo_concurrency.lockutils [None req-19da9113-e094-45fc-8687-6c21604f376d d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Releasing lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:15:16 compute-0 nova_compute[187787]: 2025-12-08 20:15:16.379 187791 DEBUG nova.compute.manager [None req-19da9113-e094-45fc-8687-6c21604f376d d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Dec 08 20:15:16 compute-0 nova_compute[187787]: 2025-12-08 20:15:16.379 187791 DEBUG nova.compute.manager [None req-19da9113-e094-45fc-8687-6c21604f376d d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] network_info to inject: |[{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Dec 08 20:15:16 compute-0 nova_compute[187787]: 2025-12-08 20:15:16.381 187791 DEBUG oslo_concurrency.lockutils [req-6ebf4012-b56e-41ad-8025-e8244fbf7f26 req-38e110d7-73a8-404a-a90b-988f892b24d1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:15:16 compute-0 nova_compute[187787]: 2025-12-08 20:15:16.382 187791 DEBUG nova.network.neutron [req-6ebf4012-b56e-41ad-8025-e8244fbf7f26 req-38e110d7-73a8-404a-a90b-988f892b24d1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Refreshing network info cache for port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:15:16 compute-0 nova_compute[187787]: 2025-12-08 20:15:16.774 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.226 187791 DEBUG nova.network.neutron [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Updating instance_info_cache with network_info: [{"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.464 187791 DEBUG oslo_concurrency.lockutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Releasing lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.466 187791 DEBUG nova.compute.manager [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:15:17 compute-0 kernel: tap488a5725-c7 (unregistering): left promiscuous mode
Dec 08 20:15:17 compute-0 NetworkManager[56229]: <info>  [1765224917.6665] device (tap488a5725-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 08 20:15:17 compute-0 ovn_controller[96170]: 2025-12-08T20:15:17Z|00063|binding|INFO|Releasing lport 488a5725-c797-4165-b8ce-319c48f2e8b8 from this chassis (sb_readonly=0)
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.675 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:17 compute-0 ovn_controller[96170]: 2025-12-08T20:15:17Z|00064|binding|INFO|Setting lport 488a5725-c797-4165-b8ce-319c48f2e8b8 down in Southbound
Dec 08 20:15:17 compute-0 ovn_controller[96170]: 2025-12-08T20:15:17Z|00065|binding|INFO|Removing iface tap488a5725-c7 ovn-installed in OVS
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.681 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:17 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:17.697 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:cd:5c 10.100.0.4'], port_security=['fa:16:3e:6c:cd:5c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '31b38d28-b90e-434c-9967-912987aee08b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad1a4d6aebb84f6fb894551cd68d2ae1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7d525409-d812-4dc1-bb50-a782007ffe4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bcdae06f-5d1b-4090-b312-569e33431ebf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=488a5725-c797-4165-b8ce-319c48f2e8b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:15:17 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:17.699 105024 INFO neutron.agent.ovn.metadata.agent [-] Port 488a5725-c797-4165-b8ce-319c48f2e8b8 in datapath f378b9ae-fe6a-498a-b0ea-0d98aea69001 unbound from our chassis
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.701 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:17 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:17.701 105024 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f378b9ae-fe6a-498a-b0ea-0d98aea69001, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 08 20:15:17 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:17.703 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[8e9e580e-42bc-4098-9fee-331fb0c2fc75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:17 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:17.703 105024 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 namespace which is not needed anymore
Dec 08 20:15:17 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec 08 20:15:17 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 14.189s CPU time.
Dec 08 20:15:17 compute-0 systemd-machined[154122]: Machine qemu-1-instance-00000001 terminated.
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.889 187791 INFO nova.virt.libvirt.driver [-] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Instance destroyed successfully.
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.890 187791 DEBUG nova.objects.instance [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lazy-loading 'resources' on Instance uuid 31b38d28-b90e-434c-9967-912987aee08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.910 187791 DEBUG nova.virt.libvirt.vif [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:14:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-662159316',display_name='tempest-ServerActionsTestJSON-server-662159316',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-662159316',id=1,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0lmx240Myi6uyvHJAjl6OdYHzSJho9DIqF0f1bqWW8lbJ2EieN8cF8oR4Ivs97IM8rHwT/JRYR62Lhhu60wGctMY+Pf4FN5Y7bGT8qLOtA+UCE3QK9D+M+fl1vEmqInA==',key_name='tempest-keypair-940238471',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:14:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ad1a4d6aebb84f6fb894551cd68d2ae1',ramdisk_id='',reservation_id='r-02hyot02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1895647313',owner_user_name='tempest-ServerActionsTestJSON-1895647313-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:15:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6e0c4248254a4bcb850e5443f0b8ad8b',uuid=31b38d28-b90e-434c-9967-912987aee08b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.911 187791 DEBUG nova.network.os_vif_util [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converting VIF {"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.912 187791 DEBUG nova.network.os_vif_util [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.913 187791 DEBUG os_vif [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.916 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.916 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap488a5725-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.918 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.921 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.925 187791 INFO os_vif [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7')
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.935 187791 DEBUG nova.virt.libvirt.driver [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Start _get_guest_xml network_info=[{"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.943 187791 WARNING nova.virt.libvirt.driver [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.952 187791 DEBUG nova.virt.libvirt.host [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.953 187791 DEBUG nova.virt.libvirt.host [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.957 187791 DEBUG nova.virt.libvirt.host [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.958 187791 DEBUG nova.virt.libvirt.host [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.958 187791 DEBUG nova.virt.libvirt.driver [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.959 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-08T20:13:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2f15909f-e95c-4c15-b311-ac90858a554d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.959 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.960 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.960 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.960 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.961 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.961 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.961 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.962 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.962 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.962 187791 DEBUG nova.virt.hardware [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.963 187791 DEBUG nova.objects.instance [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 31b38d28-b90e-434c-9967-912987aee08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:17 compute-0 nova_compute[187787]: 2025-12-08 20:15:17.983 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.073 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.config --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.075 187791 DEBUG oslo_concurrency.lockutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.076 187791 DEBUG oslo_concurrency.lockutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.077 187791 DEBUG oslo_concurrency.lockutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.079 187791 DEBUG nova.virt.libvirt.vif [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:14:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-662159316',display_name='tempest-ServerActionsTestJSON-server-662159316',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-662159316',id=1,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0lmx240Myi6uyvHJAjl6OdYHzSJho9DIqF0f1bqWW8lbJ2EieN8cF8oR4Ivs97IM8rHwT/JRYR62Lhhu60wGctMY+Pf4FN5Y7bGT8qLOtA+UCE3QK9D+M+fl1vEmqInA==',key_name='tempest-keypair-940238471',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:14:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ad1a4d6aebb84f6fb894551cd68d2ae1',ramdisk_id='',reservation_id='r-02hyot02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1895647313',owner_user_name='tempest-ServerActionsTestJSON-1895647313-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:15:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6e0c4248254a4bcb850e5443f0b8ad8b',uuid=31b38d28-b90e-434c-9967-912987aee08b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.080 187791 DEBUG nova.network.os_vif_util [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converting VIF {"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.082 187791 DEBUG nova.network.os_vif_util [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.084 187791 DEBUG nova.objects.instance [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lazy-loading 'pci_devices' on Instance uuid 31b38d28-b90e-434c-9967-912987aee08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.109 187791 DEBUG nova.virt.libvirt.driver [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] End _get_guest_xml xml=<domain type="kvm">
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <uuid>31b38d28-b90e-434c-9967-912987aee08b</uuid>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <name>instance-00000001</name>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <memory>131072</memory>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <vcpu>1</vcpu>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <metadata>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <nova:name>tempest-ServerActionsTestJSON-server-662159316</nova:name>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <nova:creationTime>2025-12-08 20:15:17</nova:creationTime>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <nova:flavor name="m1.nano">
Dec 08 20:15:18 compute-0 nova_compute[187787]:         <nova:memory>128</nova:memory>
Dec 08 20:15:18 compute-0 nova_compute[187787]:         <nova:disk>1</nova:disk>
Dec 08 20:15:18 compute-0 nova_compute[187787]:         <nova:swap>0</nova:swap>
Dec 08 20:15:18 compute-0 nova_compute[187787]:         <nova:ephemeral>0</nova:ephemeral>
Dec 08 20:15:18 compute-0 nova_compute[187787]:         <nova:vcpus>1</nova:vcpus>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       </nova:flavor>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <nova:owner>
Dec 08 20:15:18 compute-0 nova_compute[187787]:         <nova:user uuid="6e0c4248254a4bcb850e5443f0b8ad8b">tempest-ServerActionsTestJSON-1895647313-project-member</nova:user>
Dec 08 20:15:18 compute-0 nova_compute[187787]:         <nova:project uuid="ad1a4d6aebb84f6fb894551cd68d2ae1">tempest-ServerActionsTestJSON-1895647313</nova:project>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       </nova:owner>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <nova:root type="image" uuid="ffae60d8-1843-4b3a-9d11-b077095cedb9"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <nova:ports>
Dec 08 20:15:18 compute-0 nova_compute[187787]:         <nova:port uuid="488a5725-c797-4165-b8ce-319c48f2e8b8">
Dec 08 20:15:18 compute-0 nova_compute[187787]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:         </nova:port>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       </nova:ports>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     </nova:instance>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   </metadata>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <sysinfo type="smbios">
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <system>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <entry name="manufacturer">RDO</entry>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <entry name="product">OpenStack Compute</entry>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <entry name="serial">31b38d28-b90e-434c-9967-912987aee08b</entry>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <entry name="uuid">31b38d28-b90e-434c-9967-912987aee08b</entry>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <entry name="family">Virtual Machine</entry>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     </system>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   </sysinfo>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <os>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <boot dev="hd"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <smbios mode="sysinfo"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   </os>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <features>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <acpi/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <apic/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <vmcoreinfo/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   </features>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <clock offset="utc">
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <timer name="pit" tickpolicy="delay"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <timer name="hpet" present="no"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   </clock>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <cpu mode="host-model" match="exact">
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <topology sockets="1" cores="1" threads="1"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <disk type="file" device="disk">
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <target dev="vda" bus="virtio"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <disk type="file" device="cdrom">
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <driver name="qemu" type="raw" cache="none"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk.config"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <target dev="sda" bus="sata"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <interface type="ethernet">
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <mac address="fa:16:3e:6c:cd:5c"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <driver name="vhost" rx_queue_size="512"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <mtu size="1442"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <target dev="tap488a5725-c7"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <serial type="pty">
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <log file="/var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/console.log" append="off"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     </serial>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <video>
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     </video>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <input type="tablet" bus="usb"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <input type="keyboard" bus="usb"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <rng model="virtio">
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <backend model="random">/dev/urandom</backend>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <controller type="usb" index="0"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     <memballoon model="virtio">
Dec 08 20:15:18 compute-0 nova_compute[187787]:       <stats period="10"/>
Dec 08 20:15:18 compute-0 nova_compute[187787]:     </memballoon>
Dec 08 20:15:18 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:15:18 compute-0 nova_compute[187787]: </domain>
Dec 08 20:15:18 compute-0 nova_compute[187787]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.111 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.188 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.190 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.216 187791 DEBUG nova.compute.manager [req-0f76692d-2543-4e7d-b0ae-031fe179dfbe req-5d1de6f2-098c-489b-b21e-9aab037a8173 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received event network-vif-unplugged-488a5725-c797-4165-b8ce-319c48f2e8b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.217 187791 DEBUG oslo_concurrency.lockutils [req-0f76692d-2543-4e7d-b0ae-031fe179dfbe req-5d1de6f2-098c-489b-b21e-9aab037a8173 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.217 187791 DEBUG oslo_concurrency.lockutils [req-0f76692d-2543-4e7d-b0ae-031fe179dfbe req-5d1de6f2-098c-489b-b21e-9aab037a8173 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.219 187791 DEBUG oslo_concurrency.lockutils [req-0f76692d-2543-4e7d-b0ae-031fe179dfbe req-5d1de6f2-098c-489b-b21e-9aab037a8173 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.219 187791 DEBUG nova.compute.manager [req-0f76692d-2543-4e7d-b0ae-031fe179dfbe req-5d1de6f2-098c-489b-b21e-9aab037a8173 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] No waiting events found dispatching network-vif-unplugged-488a5725-c797-4165-b8ce-319c48f2e8b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.219 187791 WARNING nova.compute.manager [req-0f76692d-2543-4e7d-b0ae-031fe179dfbe req-5d1de6f2-098c-489b-b21e-9aab037a8173 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received unexpected event network-vif-unplugged-488a5725-c797-4165-b8ce-319c48f2e8b8 for instance with vm_state active and task_state reboot_started_hard.
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.255 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.257 187791 DEBUG nova.objects.instance [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 31b38d28-b90e-434c-9967-912987aee08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:18 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[214809]: [NOTICE]   (214813) : haproxy version is 2.8.14-c23fe91
Dec 08 20:15:18 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[214809]: [NOTICE]   (214813) : path to executable is /usr/sbin/haproxy
Dec 08 20:15:18 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[214809]: [WARNING]  (214813) : Exiting Master process...
Dec 08 20:15:18 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[214809]: [WARNING]  (214813) : Exiting Master process...
Dec 08 20:15:18 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[214809]: [ALERT]    (214813) : Current worker (214815) exited with code 143 (Terminated)
Dec 08 20:15:18 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[214809]: [WARNING]  (214813) : All workers exited. Exiting... (0)
Dec 08 20:15:18 compute-0 systemd[1]: libpod-d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552.scope: Deactivated successfully.
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.278 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:15:18 compute-0 podman[215512]: 2025-12-08 20:15:18.285487382 +0000 UTC m=+0.441976221 container died d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.350 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.351 187791 DEBUG nova.virt.disk.api [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Checking if we can resize image /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.351 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.405 187791 DEBUG oslo_concurrency.processutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.405 187791 DEBUG nova.virt.disk.api [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Cannot resize image /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.406 187791 DEBUG nova.objects.instance [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lazy-loading 'migration_context' on Instance uuid 31b38d28-b90e-434c-9967-912987aee08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.425 187791 DEBUG nova.virt.libvirt.vif [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:14:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-662159316',display_name='tempest-ServerActionsTestJSON-server-662159316',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-662159316',id=1,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0lmx240Myi6uyvHJAjl6OdYHzSJho9DIqF0f1bqWW8lbJ2EieN8cF8oR4Ivs97IM8rHwT/JRYR62Lhhu60wGctMY+Pf4FN5Y7bGT8qLOtA+UCE3QK9D+M+fl1vEmqInA==',key_name='tempest-keypair-940238471',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:14:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='ad1a4d6aebb84f6fb894551cd68d2ae1',ramdisk_id='',reservation_id='r-02hyot02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1895647313',owner_user_name='tempest-ServerActionsTestJSON-1895647313-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:15:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6e0c4248254a4bcb850e5443f0b8ad8b',uuid=31b38d28-b90e-434c-9967-912987aee08b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": 
{"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.426 187791 DEBUG nova.network.os_vif_util [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converting VIF {"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.426 187791 DEBUG nova.network.os_vif_util [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.427 187791 DEBUG os_vif [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.427 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.428 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.428 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.431 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.431 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap488a5725-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.432 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap488a5725-c7, col_values=(('external_ids', {'iface-id': '488a5725-c797-4165-b8ce-319c48f2e8b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6c:cd:5c', 'vm-uuid': '31b38d28-b90e-434c-9967-912987aee08b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.433 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:18 compute-0 NetworkManager[56229]: <info>  [1765224918.4345] manager: (tap488a5725-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.436 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.439 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.441 187791 INFO os_vif [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7')
Dec 08 20:15:18 compute-0 kernel: tap488a5725-c7: entered promiscuous mode
Dec 08 20:15:18 compute-0 NetworkManager[56229]: <info>  [1765224918.6509] manager: (tap488a5725-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Dec 08 20:15:18 compute-0 systemd-udevd[215494]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:15:18 compute-0 ovn_controller[96170]: 2025-12-08T20:15:18Z|00066|binding|INFO|Claiming lport 488a5725-c797-4165-b8ce-319c48f2e8b8 for this chassis.
Dec 08 20:15:18 compute-0 ovn_controller[96170]: 2025-12-08T20:15:18Z|00067|binding|INFO|488a5725-c797-4165-b8ce-319c48f2e8b8: Claiming fa:16:3e:6c:cd:5c 10.100.0.4
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.654 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:18 compute-0 NetworkManager[56229]: <info>  [1765224918.6749] device (tap488a5725-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 20:15:18 compute-0 NetworkManager[56229]: <info>  [1765224918.6757] device (tap488a5725-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 08 20:15:18 compute-0 ovn_controller[96170]: 2025-12-08T20:15:18Z|00068|binding|INFO|Setting lport 488a5725-c797-4165-b8ce-319c48f2e8b8 ovn-installed in OVS
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.686 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:18 compute-0 nova_compute[187787]: 2025-12-08 20:15:18.690 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:18 compute-0 systemd-machined[154122]: New machine qemu-5-instance-00000001.
Dec 08 20:15:18 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000001.
Dec 08 20:15:18 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552-userdata-shm.mount: Deactivated successfully.
Dec 08 20:15:18 compute-0 systemd[1]: var-lib-containers-storage-overlay-589ed7e28ecf9e04ca4cfd0159b4dba4a46e82d290a02dc79701646f72c30c5d-merged.mount: Deactivated successfully.
Dec 08 20:15:18 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:18.996 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:cd:5c 10.100.0.4'], port_security=['fa:16:3e:6c:cd:5c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '31b38d28-b90e-434c-9967-912987aee08b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad1a4d6aebb84f6fb894551cd68d2ae1', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7d525409-d812-4dc1-bb50-a782007ffe4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bcdae06f-5d1b-4090-b312-569e33431ebf, chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=488a5725-c797-4165-b8ce-319c48f2e8b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:15:18 compute-0 ovn_controller[96170]: 2025-12-08T20:15:18Z|00069|binding|INFO|Setting lport 488a5725-c797-4165-b8ce-319c48f2e8b8 up in Southbound
Dec 08 20:15:19 compute-0 podman[215555]: 2025-12-08 20:15:19.002915799 +0000 UTC m=+0.693613093 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.080 187791 DEBUG nova.virt.libvirt.host [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Removed pending event for 31b38d28-b90e-434c-9967-912987aee08b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.081 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224919.0804315, 31b38d28-b90e-434c-9967-912987aee08b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.081 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] VM Resumed (Lifecycle Event)
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.083 187791 DEBUG nova.compute.manager [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.086 187791 INFO nova.virt.libvirt.driver [-] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Instance rebooted successfully.
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.086 187791 DEBUG nova.compute.manager [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.125 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.128 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:15:19 compute-0 podman[215512]: 2025-12-08 20:15:19.136694079 +0000 UTC m=+1.293182918 container cleanup d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 08 20:15:19 compute-0 systemd[1]: libpod-conmon-d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552.scope: Deactivated successfully.
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.172 187791 DEBUG oslo_concurrency.lockutils [None req-bb44c253-9c4c-4ee2-a215-e5fcd0a4fdac 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.175 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224919.0829833, 31b38d28-b90e-434c-9967-912987aee08b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.175 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] VM Started (Lifecycle Event)
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.204 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.208 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:15:19 compute-0 podman[215629]: 2025-12-08 20:15:19.410662249 +0000 UTC m=+0.246350398 container remove d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.418 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[a0566d52-a160-451a-b5bb-cb4d80dac1c0]: (4, ('Mon Dec  8 08:15:17 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 (d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552)\nd019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552\nMon Dec  8 08:15:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 (d019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552)\nd019b3a0b1ba8395a108f12f6bd5e6456c25bf868abad2266115207e4e342552\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.419 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[9fdf2559-c3c2-41ee-a475-dfe697974b83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.420 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf378b9ae-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.464 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:19 compute-0 kernel: tapf378b9ae-f0: left promiscuous mode
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.486 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.489 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[26e775ac-2c07-4415-b55b-37e0e0c83363]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.503 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[adb448dd-e41c-47ed-8d32-27455c2f30aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.504 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[713d26db-c3fa-44b0-b568-bbf43b301bc0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.522 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[62f020ee-f59f-4674-88dc-e2d136ddf4c8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341738, 'reachable_time': 23011, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 
'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215646, 'error': None, 'target': 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.524 105136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 08 20:15:19 compute-0 systemd[1]: run-netns-ovnmeta\x2df378b9ae\x2dfe6a\x2d498a\x2db0ea\x2d0d98aea69001.mount: Deactivated successfully.
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.525 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[bddc8ca2-8127-401b-9e0b-608e8b525bd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.525 105024 INFO neutron.agent.ovn.metadata.agent [-] Port 488a5725-c797-4165-b8ce-319c48f2e8b8 in datapath f378b9ae-fe6a-498a-b0ea-0d98aea69001 unbound from our chassis
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.527 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f378b9ae-fe6a-498a-b0ea-0d98aea69001
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.539 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f375b4cc-8572-4584-bb9e-0ab08cf3c052]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.540 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf378b9ae-f1 in ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.542 214668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf378b9ae-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.542 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[965a33da-46ff-4df1-b5cc-26a2558217e0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.543 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[a7c19426-bf24-4237-9a9d-3fac003416ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.554 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[bafd474f-346a-44b9-bc6e-a1922947021e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.555 187791 DEBUG nova.objects.instance [None req-b325bca3-fb4d-45aa-98f9-4b193e4fdb36 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lazy-loading 'flavor' on Instance uuid 9f228c07-c6ac-479c-9edb-ceebc19eac87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.568 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[844e3d19-3f80-426f-9de5-a6f30a475816]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.590 187791 DEBUG oslo_concurrency.lockutils [None req-b325bca3-fb4d-45aa-98f9-4b193e4fdb36 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.595 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a56c82-70b4-4bc3-aaf6-cf61e68f9311]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 NetworkManager[56229]: <info>  [1765224919.6042] manager: (tapf378b9ae-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.605 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[c0f27996-18ed-4275-a9a0-947e0d8c7287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.633 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[4b44a3f9-4e2a-48f5-a9c7-c806826e28fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.637 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[0a7b2477-a42a-4dab-af28-27441a020ce3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 NetworkManager[56229]: <info>  [1765224919.6587] device (tapf378b9ae-f0): carrier: link connected
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.665 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[a3ca6948-b33d-44aa-b507-445ae6ccef8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.683 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f6c7e436-12bc-4344-a104-e5ea1d35213d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf378b9ae-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:1a:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345725, 'reachable_time': 38489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215671, 'error': None, 'target': 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.699 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[882cd29e-667b-4065-8c34-e357d6bc8a92]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe83:1afe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345725, 'tstamp': 345725}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215672, 'error': None, 'target': 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.716 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[abdbe3bc-e60f-4b8c-ae88-afe970721862]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf378b9ae-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:83:1a:fe'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345725, 'reachable_time': 38489, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215673, 'error': None, 'target': 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.749 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f1726f6e-b655-44f3-8be1-227df4358d52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.825 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.826 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.826 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.827 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.829 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[0eb952c8-9ae6-4cc9-9d30-bc728c12a334]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.830 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf378b9ae-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.831 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.831 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf378b9ae-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:19 compute-0 NetworkManager[56229]: <info>  [1765224919.8333] manager: (tapf378b9ae-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Dec 08 20:15:19 compute-0 kernel: tapf378b9ae-f0: entered promiscuous mode
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.834 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to take longer than expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.835 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.835 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.836 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.836 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf378b9ae-f0, col_values=(('external_ids', {'iface-id': 'e1bfa09e-e9ff-44b2-80e4-f07b8a208bf6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ovn_controller[96170]: 2025-12-08T20:15:19Z|00070|binding|INFO|Releasing lport e1bfa09e-e9ff-44b2-80e4-f07b8a208bf6 from this chassis (sb_readonly=0)
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.837 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac94830>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:15:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:19.844 14 DEBUG ceilometer.compute.discovery [-] Querying metadata for instance 31b38d28-b90e-434c-9967-912987aee08b from Nova API get_server /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:176
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.851 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.853 105024 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f378b9ae-fe6a-498a-b0ea-0d98aea69001.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f378b9ae-fe6a-498a-b0ea-0d98aea69001.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.853 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[4c31e3db-8a75-4c76-83fc-f41089340355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.854 105024 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: global
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     log         /dev/log local0 debug
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     log-tag     haproxy-metadata-proxy-f378b9ae-fe6a-498a-b0ea-0d98aea69001
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     user        root
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     group       root
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     maxconn     1024
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     pidfile     /var/lib/neutron/external/pids/f378b9ae-fe6a-498a-b0ea-0d98aea69001.pid.haproxy
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     daemon
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: defaults
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     log global
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     mode http
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     option httplog
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     option dontlognull
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     option http-server-close
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     option forwardfor
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     retries                 3
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     timeout http-request    30s
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     timeout connect         30s
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     timeout client          32s
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     timeout server          32s
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     timeout http-keep-alive 30s
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: listen listener
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     bind 169.254.169.254:80
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     server metadata /var/lib/neutron/metadata_proxy
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:     http-request add-header X-OVN-Network-ID f378b9ae-fe6a-498a-b0ea-0d98aea69001
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 08 20:15:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:19.855 105024 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'env', 'PROCESS_TAG=haproxy-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f378b9ae-fe6a-498a-b0ea-0d98aea69001.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 08 20:15:19 compute-0 nova_compute[187787]: 2025-12-08 20:15:19.950 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.027 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.028 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.077 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.082 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.135 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.136 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.199 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:15:20 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:20.265 14 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/31b38d28-b90e-434c-9967-912987aee08b -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bb2addbdb7eb2243f56077a2f33f0d742f2394f2798d43299bc395c4e79fc277" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:572
Dec 08 20:15:20 compute-0 podman[215717]: 2025-12-08 20:15:20.265997745 +0000 UTC m=+0.053072569 container create aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:15:20 compute-0 systemd[1]: Started libpod-conmon-aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e.scope.
Dec 08 20:15:20 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:15:20 compute-0 podman[215717]: 2025-12-08 20:15:20.238501846 +0000 UTC m=+0.025576690 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:15:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be619dd8d0b92127e41ca678434bbb33e9edcfb45ae64de413dbc8a4916efea6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 08 20:15:20 compute-0 podman[215717]: 2025-12-08 20:15:20.404532923 +0000 UTC m=+0.191607747 container init aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 08 20:15:20 compute-0 podman[215717]: 2025-12-08 20:15:20.411904634 +0000 UTC m=+0.198979458 container start aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.417 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.418 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5439MB free_disk=72.8238639831543GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.418 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.419 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:20 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[215734]: [NOTICE]   (215738) : New worker (215740) forked
Dec 08 20:15:20 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[215734]: [NOTICE]   (215738) : Loading success.
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.439 187791 DEBUG nova.compute.manager [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.439 187791 DEBUG oslo_concurrency.lockutils [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.439 187791 DEBUG oslo_concurrency.lockutils [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.440 187791 DEBUG oslo_concurrency.lockutils [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.440 187791 DEBUG nova.compute.manager [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] No waiting events found dispatching network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.440 187791 WARNING nova.compute.manager [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received unexpected event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 for instance with vm_state active and task_state None.
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.441 187791 DEBUG nova.compute.manager [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.441 187791 DEBUG oslo_concurrency.lockutils [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.441 187791 DEBUG oslo_concurrency.lockutils [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.442 187791 DEBUG oslo_concurrency.lockutils [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.442 187791 DEBUG nova.compute.manager [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] No waiting events found dispatching network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.442 187791 WARNING nova.compute.manager [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received unexpected event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 for instance with vm_state active and task_state None.
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.442 187791 DEBUG nova.compute.manager [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.443 187791 DEBUG oslo_concurrency.lockutils [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.443 187791 DEBUG oslo_concurrency.lockutils [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.443 187791 DEBUG oslo_concurrency.lockutils [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.443 187791 DEBUG nova.compute.manager [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] No waiting events found dispatching network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.444 187791 WARNING nova.compute.manager [req-eb44fc6c-ff73-434d-81db-459c7cde5121 req-41658438-6124-4742-adaf-bfdd239cbcb3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received unexpected event network-vif-plugged-488a5725-c797-4165-b8ce-319c48f2e8b8 for instance with vm_state active and task_state None.
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.528 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Instance 31b38d28-b90e-434c-9967-912987aee08b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.528 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Instance 9f228c07-c6ac-479c-9edb-ceebc19eac87 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.528 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.528 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.624 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.630 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.657 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.690 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:15:20 compute-0 nova_compute[187787]: 2025-12-08 20:15:20.691 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:21 compute-0 nova_compute[187787]: 2025-12-08 20:15:21.167 187791 DEBUG nova.network.neutron [req-6ebf4012-b56e-41ad-8025-e8244fbf7f26 req-38e110d7-73a8-404a-a90b-988f892b24d1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updated VIF entry in instance network info cache for port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:15:21 compute-0 nova_compute[187787]: 2025-12-08 20:15:21.167 187791 DEBUG nova.network.neutron [req-6ebf4012-b56e-41ad-8025-e8244fbf7f26 req-38e110d7-73a8-404a-a90b-988f892b24d1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updating instance_info_cache with network_info: [{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}, {"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:15:21 compute-0 nova_compute[187787]: 2025-12-08 20:15:21.195 187791 DEBUG oslo_concurrency.lockutils [req-6ebf4012-b56e-41ad-8025-e8244fbf7f26 req-38e110d7-73a8-404a-a90b-988f892b24d1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:15:21 compute-0 nova_compute[187787]: 2025-12-08 20:15:21.195 187791 DEBUG oslo_concurrency.lockutils [None req-b325bca3-fb4d-45aa-98f9-4b193e4fdb36 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquired lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:15:21 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:21.325 14 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 1978 Content-Type: application/json Date: Mon, 08 Dec 2025 20:15:20 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-68532315-7baa-4b04-94b6-b91609e050b7 x-openstack-request-id: req-68532315-7baa-4b04-94b6-b91609e050b7 _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:613
Dec 08 20:15:21 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:21.326 14 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "31b38d28-b90e-434c-9967-912987aee08b", "name": "tempest-ServerActionsTestJSON-server-662159316", "status": "ACTIVE", "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "user_id": "6e0c4248254a4bcb850e5443f0b8ad8b", "metadata": {}, "hostId": "6f2843364e7b88f9c173ac70e4abaabde3aec38b114e7f65b51cd5a4", "image": {"id": "ffae60d8-1843-4b3a-9d11-b077095cedb9", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/ffae60d8-1843-4b3a-9d11-b077095cedb9"}]}, "flavor": {"id": "2f15909f-e95c-4c15-b311-ac90858a554d", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/2f15909f-e95c-4c15-b311-ac90858a554d"}]}, "created": "2025-12-08T20:14:20Z", "updated": "2025-12-08T20:15:19Z", "addresses": {"tempest-ServerActionsTestJSON-2051911313-network": [{"version": 4, "addr": "10.100.0.4", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:6c:cd:5c"}, {"version": 4, "addr": "192.168.122.202", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:6c:cd:5c"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/31b38d28-b90e-434c-9967-912987aee08b"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/31b38d28-b90e-434c-9967-912987aee08b"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-keypair-940238471", "OS-SRV-USG:launched_at": "2025-12-08T20:14:42.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-securitygroup--278852784"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000001", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:648
Dec 08 20:15:21 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:21.326 14 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/31b38d28-b90e-434c-9967-912987aee08b used request id req-68532315-7baa-4b04-94b6-b91609e050b7 request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:1073
Dec 08 20:15:21 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:21.327 14 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '31b38d28-b90e-434c-9967-912987aee08b', 'name': 'tempest-ServerActionsTestJSON-server-662159316', 'flavor': {'id': '2f15909f-e95c-4c15-b311-ac90858a554d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'ad1a4d6aebb84f6fb894551cd68d2ae1', 'user_id': '6e0c4248254a4bcb850e5443f0b8ad8b', 'hostId': '6f2843364e7b88f9c173ac70e4abaabde3aec38b114e7f65b51cd5a4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:315
Dec 08 20:15:21 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:21.330 14 DEBUG ceilometer.compute.discovery [-] Querying metadata for instance 9f228c07-c6ac-479c-9edb-ceebc19eac87 from Nova API get_server /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:176
Dec 08 20:15:21 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:21.331 14 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/servers/9f228c07-c6ac-479c-9edb-ceebc19eac87 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}bb2addbdb7eb2243f56077a2f33f0d742f2394f2798d43299bc395c4e79fc277" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:572
Dec 08 20:15:21 compute-0 nova_compute[187787]: 2025-12-08 20:15:21.691 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:15:21 compute-0 nova_compute[187787]: 2025-12-08 20:15:21.692 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:15:21 compute-0 nova_compute[187787]: 2025-12-08 20:15:21.692 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:15:22 compute-0 nova_compute[187787]: 2025-12-08 20:15:22.092 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:15:22 compute-0 nova_compute[187787]: 2025-12-08 20:15:22.092 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquired lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:15:22 compute-0 nova_compute[187787]: 2025-12-08 20:15:22.093 187791 DEBUG nova.network.neutron [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 08 20:15:22 compute-0 nova_compute[187787]: 2025-12-08 20:15:22.093 187791 DEBUG nova.objects.instance [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 31b38d28-b90e-434c-9967-912987aee08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:22 compute-0 nova_compute[187787]: 2025-12-08 20:15:22.706 187791 DEBUG nova.network.neutron [None req-b325bca3-fb4d-45aa-98f9-4b193e4fdb36 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:15:22 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:22.992 14 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 2112 Content-Type: application/json Date: Mon, 08 Dec 2025 20:15:21 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d911da46-3c23-409b-b17a-923f27843b69 x-openstack-request-id: req-d911da46-3c23-409b-b17a-923f27843b69 _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:613
Dec 08 20:15:22 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:22.992 14 DEBUG novaclient.v2.client [-] RESP BODY: {"server": {"id": "9f228c07-c6ac-479c-9edb-ceebc19eac87", "name": "tempest-AttachInterfacesUnderV243Test-server-1776949335", "status": "ACTIVE", "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "user_id": "d1c8c9756a134cd7a38cb55743f12dad", "metadata": {}, "hostId": "946685139e07b9cb6a73619b1aceac79d5422379f86fbef05fd36702", "image": {"id": "ffae60d8-1843-4b3a-9d11-b077095cedb9", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/images/ffae60d8-1843-4b3a-9d11-b077095cedb9"}]}, "flavor": {"id": "2f15909f-e95c-4c15-b311-ac90858a554d", "links": [{"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/2f15909f-e95c-4c15-b311-ac90858a554d"}]}, "created": "2025-12-08T20:14:24Z", "updated": "2025-12-08T20:15:16Z", "addresses": {"tempest-AttachInterfacesUnderV243Test-348537151-network": [{"version": 4, "addr": "10.100.0.10", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:d3:0b:70"}, {"version": 4, "addr": "10.100.0.11", "OS-EXT-IPS:type": "fixed", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:d3:0b:70"}, {"version": 4, "addr": "192.168.122.187", "OS-EXT-IPS:type": "floating", "OS-EXT-IPS-MAC:mac_addr": "fa:16:3e:d3:0b:70"}]}, "accessIPv4": "", "accessIPv6": "", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/servers/9f228c07-c6ac-479c-9edb-ceebc19eac87"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/servers/9f228c07-c6ac-479c-9edb-ceebc19eac87"}], "OS-DCF:diskConfig": "MANUAL", "progress": 0, "OS-EXT-AZ:availability_zone": "nova", "config_drive": "True", "key_name": "tempest-keypair-1472805335", "OS-SRV-USG:launched_at": "2025-12-08T20:14:42.000000", "OS-SRV-USG:terminated_at": null, "security_groups": [{"name": "tempest-securitygroup--1938495777"}], "OS-EXT-SRV-ATTR:host": "compute-0.ctlplane.example.com", "OS-EXT-SRV-ATTR:instance_name": "instance-00000003", "OS-EXT-SRV-ATTR:hypervisor_hostname": "compute-0.ctlplane.example.com", "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "active", "OS-EXT-STS:power_state": 1, "os-extended-volumes:volumes_attached": []}} _http_log_response /usr/lib/python3.12/site-packages/keystoneauth1/session.py:648
Dec 08 20:15:22 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:22.993 14 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/servers/9f228c07-c6ac-479c-9edb-ceebc19eac87 used request id req-d911da46-3c23-409b-b17a-923f27843b69 request /usr/lib/python3.12/site-packages/keystoneauth1/session.py:1073
Dec 08 20:15:22 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:22.994 14 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9f228c07-c6ac-479c-9edb-ceebc19eac87', 'name': 'tempest-AttachInterfacesUnderV243Test-server-1776949335', 'flavor': {'id': '2f15909f-e95c-4c15-b311-ac90858a554d', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-0.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '73150461bb354f0fb8f4adf266d52ac8', 'user_id': 'd1c8c9756a134cd7a38cb55743f12dad', 'hostId': '946685139e07b9cb6a73619b1aceac79d5422379f86fbef05fd36702', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.12/site-packages/ceilometer/compute/discovery.py:315
Dec 08 20:15:22 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:22.994 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 08 20:15:22 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:22.995 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:22 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:22.995 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:22 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:22.996 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.device.read.latency heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:22 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:22.998 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.latency (2025-12-08T20:15:22.995950) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.050 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.read.latency volume: 235312211 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.052 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.read.latency volume: 249897 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.091 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.read.latency volume: 321777261 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.091 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.read.latency volume: 24662993 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.092 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.latency in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.093 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.093 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.093 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.093 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.093 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.device.usage heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.094 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.usage (2025-12-08T20:15:23.093449) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.109 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.109 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.121 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.121 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.usage volume: 509952 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.122 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.usage in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.122 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.122 14 INFO ceilometer.polling.manager [-] Polling pollster power.state in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.123 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.123 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.123 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for power.state (2025-12-08T20:15:23.123241) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.123 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: power.state heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.141 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/power.state volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.158 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/power.state volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.159 14 INFO ceilometer.polling.manager [-] Finished polling pollster power.state in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.159 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.159 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.159 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.159 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.160 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.bytes (2025-12-08T20:15:23.159848) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.159 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.device.write.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.160 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.160 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.160 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.write.bytes volume: 72949760 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.161 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.161 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.bytes in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.161 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.162 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.162 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.162 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.162 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.latency (2025-12-08T20:15:23.162255) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.162 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.device.write.latency heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.162 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.162 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.163 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.write.latency volume: 3518560976 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.163 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.163 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.latency in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.164 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.164 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.164 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.164 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.164 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.rate (2025-12-08T20:15:23.164424) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.164 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.incoming.bytes.rate heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.164 14 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:162
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.165 14 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-662159316>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-1776949335>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-662159316>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-1776949335>]
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.166 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.166 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.166 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.166 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.166 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.write.requests (2025-12-08T20:15:23.166608) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.166 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.device.write.requests heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.167 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.167 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.167 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.write.requests volume: 313 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.167 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.168 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.write.requests in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.168 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.168 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.168 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.169 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.169 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets (2025-12-08T20:15:23.169212) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.169 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.incoming.packets heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.172 14 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 31b38d28-b90e-434c-9967-912987aee08b / tap488a5725-c7 inspect_vnics /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/inspector.py:143
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.172 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.175 14 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9f228c07-c6ac-479c-9edb-ceebc19eac87 / tapaf16ddfd-01 inspect_vnics /usr/lib/python3.12/site-packages/ceilometer/compute/virt/libvirt/inspector.py:143
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.175 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.incoming.packets volume: 26 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.176 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.176 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.176 14 INFO ceilometer.polling.manager [-] Polling pollster disk.ephemeral.size in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.176 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.176 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.177 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.ephemeral.size (2025-12-08T20:15:23.176774) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.176 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.ephemeral.size heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.177 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.ephemeral.size in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.178 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.178 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.178 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.178 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.178 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.drop (2025-12-08T20:15:23.178486) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.178 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.incoming.packets.drop heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.178 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.179 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.179 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.drop in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.179 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.180 14 INFO ceilometer.polling.manager [-] Polling pollster disk.root.size in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.180 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.180 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.180 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.root.size (2025-12-08T20:15:23.180375) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.180 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.root.size heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.181 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.root.size in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.181 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.181 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.181 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.181 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.182 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.allocation (2025-12-08T20:15:23.182064) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.182 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.device.allocation heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.182 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.182 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.183 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.183 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.allocation volume: 512000 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.184 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.allocation in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.184 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.184 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.184 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.184 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.184 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.packets.error (2025-12-08T20:15:23.184642) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.184 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.incoming.packets.error heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.185 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.185 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.185 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.packets.error in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.186 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.186 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.186 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.186 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.186 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes.delta (2025-12-08T20:15:23.186669) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.186 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.incoming.bytes.delta heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.187 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.187 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.188 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.188 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.188 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.188 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.188 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.188 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes (2025-12-08T20:15:23.188625) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.188 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.outgoing.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.189 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.189 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.outgoing.bytes volume: 3390 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.189 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.190 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.190 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.190 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.190 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.190 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.requests (2025-12-08T20:15:23.190540) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.190 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.device.read.requests heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.191 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.191 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.191 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.read.requests volume: 1075 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.191 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.read.requests volume: 120 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.192 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.requests in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.192 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.192 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.192 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.193 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.193 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.delta (2025-12-08T20:15:23.193173) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.193 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.outgoing.bytes.delta heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.193 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.193 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.194 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.194 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.194 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.194 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.195 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.195 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.bytes.rate (2025-12-08T20:15:23.195126) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.195 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.outgoing.bytes.rate heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.195 14 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:162
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.195 14 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-ServerActionsTestJSON-server-662159316>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-1776949335>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-ServerActionsTestJSON-server-662159316>, <NovaLikeServer: tempest-AttachInterfacesUnderV243Test-server-1776949335>]
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.196 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.196 14 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.196 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.196 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.196 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for memory.usage (2025-12-08T20:15:23.196583) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.196 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: memory.usage heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.197 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.197 14 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 31b38d28-b90e-434c-9967-912987aee08b: ceilometer.compute.pollsters.NoVolumeException
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.197 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/memory.usage volume: 46.65234375 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.197 14 INFO ceilometer.polling.manager [-] Finished polling pollster memory.usage in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.197 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.198 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.198 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.198 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.198 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets (2025-12-08T20:15:23.198337) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.198 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.outgoing.packets heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.198 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.198 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.outgoing.packets volume: 28 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.199 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.199 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.199 14 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.199 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.199 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.200 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.incoming.bytes (2025-12-08T20:15:23.199852) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.199 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.incoming.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.200 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.200 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.incoming.bytes volume: 4195 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.200 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.incoming.bytes in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.200 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.201 14 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.201 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.201 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.201 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for cpu (2025-12-08T20:15:23.201325) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.201 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: cpu heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.201 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/cpu volume: 3780000000 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.201 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/cpu volume: 12050000000 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.202 14 INFO ceilometer.polling.manager [-] Finished polling pollster cpu in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.202 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.202 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.202 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.202 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.202 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.drop (2025-12-08T20:15:23.202760) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.202 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.outgoing.packets.drop heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.203 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.203 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.203 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.204 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.204 14 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.204 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.204 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.204 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for network.outgoing.packets.error (2025-12-08T20:15:23.204410) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.204 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: network.outgoing.packets.error heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.204 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.205 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.205 14 INFO ceilometer.polling.manager [-] Finished polling pollster network.outgoing.packets.error in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.205 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.205 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.205 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.205 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.206 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.capacity (2025-12-08T20:15:23.205919) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.205 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.device.capacity heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.206 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.206 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.206 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.207 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.capacity volume: 509952 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.207 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.capacity in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.207 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.207 14 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.207 14 DEBUG ceilometer.polling.manager [-] Checking if we need coordination for pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] with coordination group name [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:333
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.207 14 DEBUG ceilometer.polling.manager [-] The pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] is not configured in a source for polling that requires coordination. The current hashrings are the following [None]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:355
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.208 12 DEBUG ceilometer.polling.manager [-] Updated heartbeat for disk.device.read.bytes (2025-12-08T20:15:23.208036) _update_status /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:502
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.208 14 DEBUG ceilometer.polling.manager [-] Pollster heartbeat update: disk.device.read.bytes heartbeat /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:636
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.208 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.208 14 DEBUG ceilometer.compute.pollsters [-] 31b38d28-b90e-434c-9967-912987aee08b/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.208 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.read.bytes volume: 29997568 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.209 14 DEBUG ceilometer.compute.pollsters [-] 9f228c07-c6ac-479c-9edb-ceebc19eac87/disk.device.read.bytes volume: 299326 _stats_to_sample /usr/lib/python3.12/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.209 14 INFO ceilometer.polling.manager [-] Finished polling pollster disk.device.read.bytes in the context of pollsters
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.209 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.210 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.210 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.210 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.210 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.210 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.211 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.211 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.211 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.211 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.211 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.211 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.212 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.212 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.212 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.212 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.212 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.212 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.213 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.213 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.213 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.213 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.213 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.213 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.214 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:15:23.214 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:15:23 compute-0 nova_compute[187787]: 2025-12-08 20:15:23.378 187791 DEBUG nova.compute.manager [req-1211fbcc-0959-49ec-9f3b-03ca928cbb2c req-94d7fd72-3009-4836-a737-5085ee4dfdab 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-changed-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:23 compute-0 nova_compute[187787]: 2025-12-08 20:15:23.379 187791 DEBUG nova.compute.manager [req-1211fbcc-0959-49ec-9f3b-03ca928cbb2c req-94d7fd72-3009-4836-a737-5085ee4dfdab 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Refreshing instance network info cache due to event network-changed-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:15:23 compute-0 nova_compute[187787]: 2025-12-08 20:15:23.379 187791 DEBUG oslo_concurrency.lockutils [req-1211fbcc-0959-49ec-9f3b-03ca928cbb2c req-94d7fd72-3009-4836-a737-5085ee4dfdab 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:15:23 compute-0 nova_compute[187787]: 2025-12-08 20:15:23.435 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:24 compute-0 nova_compute[187787]: 2025-12-08 20:15:24.047 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:24 compute-0 nova_compute[187787]: 2025-12-08 20:15:24.567 187791 DEBUG nova.network.neutron [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Updating instance_info_cache with network_info: [{"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:15:24 compute-0 nova_compute[187787]: 2025-12-08 20:15:24.587 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Releasing lock "refresh_cache-31b38d28-b90e-434c-9967-912987aee08b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:15:24 compute-0 nova_compute[187787]: 2025-12-08 20:15:24.587 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Dec 08 20:15:24 compute-0 nova_compute[187787]: 2025-12-08 20:15:24.587 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:15:24 compute-0 nova_compute[187787]: 2025-12-08 20:15:24.587 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:15:24 compute-0 nova_compute[187787]: 2025-12-08 20:15:24.588 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:15:24 compute-0 nova_compute[187787]: 2025-12-08 20:15:24.588 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:15:24 compute-0 nova_compute[187787]: 2025-12-08 20:15:24.588 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:15:25 compute-0 nova_compute[187787]: 2025-12-08 20:15:25.195 187791 DEBUG nova.network.neutron [None req-b325bca3-fb4d-45aa-98f9-4b193e4fdb36 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updating instance_info_cache with network_info: [{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:15:25 compute-0 nova_compute[187787]: 2025-12-08 20:15:25.219 187791 DEBUG oslo_concurrency.lockutils [None req-b325bca3-fb4d-45aa-98f9-4b193e4fdb36 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Releasing lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:15:25 compute-0 nova_compute[187787]: 2025-12-08 20:15:25.219 187791 DEBUG nova.compute.manager [None req-b325bca3-fb4d-45aa-98f9-4b193e4fdb36 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144
Dec 08 20:15:25 compute-0 nova_compute[187787]: 2025-12-08 20:15:25.220 187791 DEBUG nova.compute.manager [None req-b325bca3-fb4d-45aa-98f9-4b193e4fdb36 d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] network_info to inject: |[{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145
Dec 08 20:15:25 compute-0 nova_compute[187787]: 2025-12-08 20:15:25.222 187791 DEBUG oslo_concurrency.lockutils [req-1211fbcc-0959-49ec-9f3b-03ca928cbb2c req-94d7fd72-3009-4836-a737-5085ee4dfdab 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:15:25 compute-0 nova_compute[187787]: 2025-12-08 20:15:25.222 187791 DEBUG nova.network.neutron [req-1211fbcc-0959-49ec-9f3b-03ca928cbb2c req-94d7fd72-3009-4836-a737-5085ee4dfdab 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Refreshing network info cache for port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:15:25 compute-0 nova_compute[187787]: 2025-12-08 20:15:25.627 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.053 187791 DEBUG oslo_concurrency.lockutils [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "9f228c07-c6ac-479c-9edb-ceebc19eac87" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.055 187791 DEBUG oslo_concurrency.lockutils [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.056 187791 DEBUG oslo_concurrency.lockutils [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.056 187791 DEBUG oslo_concurrency.lockutils [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.056 187791 DEBUG oslo_concurrency.lockutils [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.057 187791 INFO nova.compute.manager [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Terminating instance
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.059 187791 DEBUG nova.compute.manager [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 08 20:15:26 compute-0 kernel: tapaf16ddfd-01 (unregistering): left promiscuous mode
Dec 08 20:15:26 compute-0 NetworkManager[56229]: <info>  [1765224926.2530] device (tapaf16ddfd-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 08 20:15:26 compute-0 ovn_controller[96170]: 2025-12-08T20:15:26Z|00071|binding|INFO|Releasing lport af16ddfd-01f8-4225-96a8-8ec9a5aa19ba from this chassis (sb_readonly=0)
Dec 08 20:15:26 compute-0 ovn_controller[96170]: 2025-12-08T20:15:26Z|00072|binding|INFO|Setting lport af16ddfd-01f8-4225-96a8-8ec9a5aa19ba down in Southbound
Dec 08 20:15:26 compute-0 ovn_controller[96170]: 2025-12-08T20:15:26Z|00073|binding|INFO|Removing iface tapaf16ddfd-01 ovn-installed in OVS
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.266 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.283 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d3:0b:70 10.100.0.10'], port_security=['fa:16:3e:d3:0b:70 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '9f228c07-c6ac-479c-9edb-ceebc19eac87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66073b60-2cee-4d92-b656-15d29787b3b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '73150461bb354f0fb8f4adf266d52ac8', 'neutron:revision_number': '6', 'neutron:security_group_ids': '128a4b39-b2dc-478a-8a31-e528cec44116', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.187'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9088ad92-ddfd-4933-9885-66eab30c7262, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.284 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.285 105024 INFO neutron.agent.ovn.metadata.agent [-] Port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba in datapath 66073b60-2cee-4d92-b656-15d29787b3b5 unbound from our chassis
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.286 105024 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66073b60-2cee-4d92-b656-15d29787b3b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.289 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[9ce05a77-e4c2-49fa-8c3d-f6345e47735b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.290 105024 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5 namespace which is not needed anymore
Dec 08 20:15:26 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Deactivated successfully.
Dec 08 20:15:26 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000003.scope: Consumed 14.725s CPU time.
Dec 08 20:15:26 compute-0 systemd-machined[154122]: Machine qemu-3-instance-00000003 terminated.
Dec 08 20:15:26 compute-0 neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5[214962]: [NOTICE]   (214966) : haproxy version is 2.8.14-c23fe91
Dec 08 20:15:26 compute-0 neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5[214962]: [NOTICE]   (214966) : path to executable is /usr/sbin/haproxy
Dec 08 20:15:26 compute-0 neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5[214962]: [WARNING]  (214966) : Exiting Master process...
Dec 08 20:15:26 compute-0 neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5[214962]: [WARNING]  (214966) : Exiting Master process...
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.486 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:26 compute-0 neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5[214962]: [ALERT]    (214966) : Current worker (214968) exited with code 143 (Terminated)
Dec 08 20:15:26 compute-0 neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5[214962]: [WARNING]  (214966) : All workers exited. Exiting... (0)
Dec 08 20:15:26 compute-0 systemd[1]: libpod-4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911.scope: Deactivated successfully.
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.493 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:26 compute-0 podman[215774]: 2025-12-08 20:15:26.496364897 +0000 UTC m=+0.105375613 container died 4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 08 20:15:26 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911-userdata-shm.mount: Deactivated successfully.
Dec 08 20:15:26 compute-0 systemd[1]: var-lib-containers-storage-overlay-2c429656d1c4f4b073eac50c2d6d9fbfd370216964d3fdfbe354849a539a6166-merged.mount: Deactivated successfully.
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.533 187791 INFO nova.virt.libvirt.driver [-] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Instance destroyed successfully.
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.534 187791 DEBUG nova.objects.instance [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lazy-loading 'resources' on Instance uuid 9f228c07-c6ac-479c-9edb-ceebc19eac87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:26 compute-0 podman[215774]: 2025-12-08 20:15:26.537655268 +0000 UTC m=+0.146665984 container cleanup 4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 08 20:15:26 compute-0 systemd[1]: libpod-conmon-4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911.scope: Deactivated successfully.
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.554 187791 DEBUG nova.virt.libvirt.vif [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:14:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesUnderV243Test-server-1776949335',display_name='tempest-AttachInterfacesUnderV243Test-server-1776949335',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-attachinterfacesunderv243test-server-1776949335',id=3,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHUXKCLdMozg5QacQQMDuqiDmsucELgm2tHYaW7qA0wQwZkWq396yhR854vKmw7vKOgqLiXWTJUWaZ6YubHL3+CGLnFRSVD05P9VGevvVx9c62su/GSL+XmKXtiO6ODbEg==',key_name='tempest-keypair-1472805335',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:14:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='73150461bb354f0fb8f4adf266d52ac8',ramdisk_id='',reservation_id='r-rkf8jk67',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesUnderV243Test-1745682393',owner_user_name='tempest-AttachInterfacesUnderV243Test-1745682393-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:15:25Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='d1c8c9756a134cd7a38cb55743f12dad',uuid=9f228c07-c6ac-479c-9edb-ceebc19eac87,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.554 187791 DEBUG nova.network.os_vif_util [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Converting VIF {"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.555 187791 DEBUG nova.network.os_vif_util [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d3:0b:70,bridge_name='br-int',has_traffic_filtering=True,id=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba,network=Network(66073b60-2cee-4d92-b656-15d29787b3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf16ddfd-01') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.555 187791 DEBUG os_vif [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:0b:70,bridge_name='br-int',has_traffic_filtering=True,id=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba,network=Network(66073b60-2cee-4d92-b656-15d29787b3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf16ddfd-01') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.557 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.557 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf16ddfd-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.558 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.560 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.566 187791 INFO os_vif [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d3:0b:70,bridge_name='br-int',has_traffic_filtering=True,id=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba,network=Network(66073b60-2cee-4d92-b656-15d29787b3b5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf16ddfd-01')
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.566 187791 INFO nova.virt.libvirt.driver [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Deleting instance files /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87_del
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.567 187791 INFO nova.virt.libvirt.driver [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Deletion of /var/lib/nova/instances/9f228c07-c6ac-479c-9edb-ceebc19eac87_del complete
Dec 08 20:15:26 compute-0 podman[215817]: 2025-12-08 20:15:26.596787366 +0000 UTC m=+0.040502237 container remove 4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.600 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5c95a5-ab9e-4b4f-973a-7e6f0412d3ff]: (4, ('Mon Dec  8 08:15:26 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5 (4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911)\n4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911\nMon Dec  8 08:15:26 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5 (4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911)\n4b5626326a21cb8777a8c63177525ce62b35473dde9b2d1620ad98e5b5926911\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.604 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[1a54a3da-6131-4a1e-81bf-ec1c4ffc75b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.605 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap66073b60-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:26 compute-0 kernel: tap66073b60-20: left promiscuous mode
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.612 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.613 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[e080b28a-fdcd-4c0b-8101-f5218017c76d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.624 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.632 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[b42d351c-681c-4e36-9ecb-b189fc0b7fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.633 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[269a323d-868e-4435-b8e7-69655d84b9d2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.648 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8a2b2b-6c90-44e4-952b-eac15a23042f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 341903, 'reachable_time': 30305, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 
'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215833, 'error': None, 'target': 'ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.651 105136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-66073b60-2cee-4d92-b656-15d29787b3b5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 08 20:15:26 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:26.651 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[de04d837-a9b5-491d-8c03-ae92343df32f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:26 compute-0 systemd[1]: run-netns-ovnmeta\x2d66073b60\x2d2cee\x2d4d92\x2db656\x2d15d29787b3b5.mount: Deactivated successfully.
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.676 187791 INFO nova.compute.manager [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Took 0.62 seconds to destroy the instance on the hypervisor.
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.677 187791 DEBUG oslo.service.loopingcall [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.678 187791 DEBUG nova.compute.manager [-] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.678 187791 DEBUG nova.network.neutron [-] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.864 187791 DEBUG nova.compute.manager [req-ded9f18a-c807-4115-9d4f-96da40f9a5e3 req-968ee0cd-ea72-4607-9e48-baaab7bdb678 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-vif-unplugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.864 187791 DEBUG oslo_concurrency.lockutils [req-ded9f18a-c807-4115-9d4f-96da40f9a5e3 req-968ee0cd-ea72-4607-9e48-baaab7bdb678 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.865 187791 DEBUG oslo_concurrency.lockutils [req-ded9f18a-c807-4115-9d4f-96da40f9a5e3 req-968ee0cd-ea72-4607-9e48-baaab7bdb678 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.865 187791 DEBUG oslo_concurrency.lockutils [req-ded9f18a-c807-4115-9d4f-96da40f9a5e3 req-968ee0cd-ea72-4607-9e48-baaab7bdb678 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.866 187791 DEBUG nova.compute.manager [req-ded9f18a-c807-4115-9d4f-96da40f9a5e3 req-968ee0cd-ea72-4607-9e48-baaab7bdb678 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] No waiting events found dispatching network-vif-unplugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:15:26 compute-0 nova_compute[187787]: 2025-12-08 20:15:26.866 187791 DEBUG nova.compute.manager [req-ded9f18a-c807-4115-9d4f-96da40f9a5e3 req-968ee0cd-ea72-4607-9e48-baaab7bdb678 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-vif-unplugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 08 20:15:28 compute-0 podman[215834]: 2025-12-08 20:15:28.547821787 +0000 UTC m=+0.111313349 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4)
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.311 187791 DEBUG nova.compute.manager [req-d3c10757-5129-451a-89c3-1d699e3afd33 req-c25befe1-1c58-493f-a2a5-8c05b02a72e8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.312 187791 DEBUG oslo_concurrency.lockutils [req-d3c10757-5129-451a-89c3-1d699e3afd33 req-c25befe1-1c58-493f-a2a5-8c05b02a72e8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.312 187791 DEBUG oslo_concurrency.lockutils [req-d3c10757-5129-451a-89c3-1d699e3afd33 req-c25befe1-1c58-493f-a2a5-8c05b02a72e8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.312 187791 DEBUG oslo_concurrency.lockutils [req-d3c10757-5129-451a-89c3-1d699e3afd33 req-c25befe1-1c58-493f-a2a5-8c05b02a72e8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.312 187791 DEBUG nova.compute.manager [req-d3c10757-5129-451a-89c3-1d699e3afd33 req-c25befe1-1c58-493f-a2a5-8c05b02a72e8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] No waiting events found dispatching network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.313 187791 WARNING nova.compute.manager [req-d3c10757-5129-451a-89c3-1d699e3afd33 req-c25befe1-1c58-493f-a2a5-8c05b02a72e8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received unexpected event network-vif-plugged-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba for instance with vm_state active and task_state deleting.
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.530 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.537 187791 DEBUG nova.network.neutron [-] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.566 187791 DEBUG nova.network.neutron [req-1211fbcc-0959-49ec-9f3b-03ca928cbb2c req-94d7fd72-3009-4836-a737-5085ee4dfdab 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updated VIF entry in instance network info cache for port af16ddfd-01f8-4225-96a8-8ec9a5aa19ba. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.566 187791 DEBUG nova.network.neutron [req-1211fbcc-0959-49ec-9f3b-03ca928cbb2c req-94d7fd72-3009-4836-a737-5085ee4dfdab 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Updating instance_info_cache with network_info: [{"id": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "address": "fa:16:3e:d3:0b:70", "network": {"id": "66073b60-2cee-4d92-b656-15d29787b3b5", "bridge": "br-int", "label": "tempest-AttachInterfacesUnderV243Test-348537151-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.187", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "73150461bb354f0fb8f4adf266d52ac8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf16ddfd-01", "ovs_interfaceid": "af16ddfd-01f8-4225-96a8-8ec9a5aa19ba", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.602 187791 INFO nova.compute.manager [-] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Took 2.92 seconds to deallocate network for instance.
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.648 187791 DEBUG oslo_concurrency.lockutils [req-1211fbcc-0959-49ec-9f3b-03ca928cbb2c req-94d7fd72-3009-4836-a737-5085ee4dfdab 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-9f228c07-c6ac-479c-9edb-ceebc19eac87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.708 187791 DEBUG oslo_concurrency.lockutils [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.709 187791 DEBUG oslo_concurrency.lockutils [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:29 compute-0 podman[202017]: time="2025-12-08T20:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:15:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 23776 "" "Go-http-client/1.1"
Dec 08 20:15:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3922 "" "Go-http-client/1.1"
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.843 187791 DEBUG nova.compute.provider_tree [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.866 187791 DEBUG nova.scheduler.client.report [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.923 187791 DEBUG oslo_concurrency.lockutils [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.214s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:29 compute-0 nova_compute[187787]: 2025-12-08 20:15:29.963 187791 INFO nova.scheduler.client.report [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Deleted allocations for instance 9f228c07-c6ac-479c-9edb-ceebc19eac87
Dec 08 20:15:30 compute-0 nova_compute[187787]: 2025-12-08 20:15:30.049 187791 DEBUG oslo_concurrency.lockutils [None req-a609c8a3-8984-4143-9c16-b35e17ed63ab d1c8c9756a134cd7a38cb55743f12dad 73150461bb354f0fb8f4adf266d52ac8 - - default default] Lock "9f228c07-c6ac-479c-9edb-ceebc19eac87" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.993s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:30 compute-0 nova_compute[187787]: 2025-12-08 20:15:30.629 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:31 compute-0 openstack_network_exporter[204149]: ERROR   20:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:15:31 compute-0 openstack_network_exporter[204149]: ERROR   20:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:15:31 compute-0 openstack_network_exporter[204149]: ERROR   20:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:15:31 compute-0 openstack_network_exporter[204149]: ERROR   20:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:15:31 compute-0 openstack_network_exporter[204149]: ERROR   20:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:15:31 compute-0 ovn_controller[96170]: 2025-12-08T20:15:31Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6c:cd:5c 10.100.0.4
Dec 08 20:15:31 compute-0 nova_compute[187787]: 2025-12-08 20:15:31.558 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:31 compute-0 nova_compute[187787]: 2025-12-08 20:15:31.640 187791 DEBUG nova.compute.manager [req-0c618b0c-fe69-43f8-b828-8d5d287f3db7 req-9f892c52-6713-4bdd-9f4f-6959f426293d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Received event network-vif-deleted-af16ddfd-01f8-4225-96a8-8ec9a5aa19ba external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:31 compute-0 nova_compute[187787]: 2025-12-08 20:15:31.640 187791 INFO nova.compute.manager [req-0c618b0c-fe69-43f8-b828-8d5d287f3db7 req-9f892c52-6713-4bdd-9f4f-6959f426293d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Neutron deleted interface af16ddfd-01f8-4225-96a8-8ec9a5aa19ba; detaching it from the instance and deleting it from the info cache
Dec 08 20:15:31 compute-0 nova_compute[187787]: 2025-12-08 20:15:31.641 187791 DEBUG nova.network.neutron [req-0c618b0c-fe69-43f8-b828-8d5d287f3db7 req-9f892c52-6713-4bdd-9f4f-6959f426293d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106
Dec 08 20:15:31 compute-0 nova_compute[187787]: 2025-12-08 20:15:31.643 187791 DEBUG nova.compute.manager [req-0c618b0c-fe69-43f8-b828-8d5d287f3db7 req-9f892c52-6713-4bdd-9f4f-6959f426293d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Detach interface failed, port_id=af16ddfd-01f8-4225-96a8-8ec9a5aa19ba, reason: Instance 9f228c07-c6ac-479c-9edb-ceebc19eac87 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Dec 08 20:15:33 compute-0 nova_compute[187787]: 2025-12-08 20:15:33.761 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:34 compute-0 sshd-session[215867]: Received disconnect from 45.174.162.68 port 4584:11: Bye Bye [preauth]
Dec 08 20:15:34 compute-0 sshd-session[215867]: Disconnected from authenticating user root 45.174.162.68 port 4584 [preauth]
Dec 08 20:15:35 compute-0 podman[215869]: 2025-12-08 20:15:35.490964971 +0000 UTC m=+0.065672213 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41)
Dec 08 20:15:35 compute-0 nova_compute[187787]: 2025-12-08 20:15:35.639 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:35 compute-0 nova_compute[187787]: 2025-12-08 20:15:35.643 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:36 compute-0 nova_compute[187787]: 2025-12-08 20:15:36.561 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:37 compute-0 podman[215892]: 2025-12-08 20:15:37.507028863 +0000 UTC m=+0.071948279 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Dec 08 20:15:37 compute-0 podman[215891]: 2025-12-08 20:15:37.511351899 +0000 UTC m=+0.087240227 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.642 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.944 187791 DEBUG oslo_concurrency.lockutils [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.945 187791 DEBUG oslo_concurrency.lockutils [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.945 187791 DEBUG oslo_concurrency.lockutils [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "31b38d28-b90e-434c-9967-912987aee08b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.945 187791 DEBUG oslo_concurrency.lockutils [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.946 187791 DEBUG oslo_concurrency.lockutils [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.947 187791 INFO nova.compute.manager [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Terminating instance
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.947 187791 DEBUG nova.compute.manager [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 08 20:15:40 compute-0 kernel: tap488a5725-c7 (unregistering): left promiscuous mode
Dec 08 20:15:40 compute-0 NetworkManager[56229]: <info>  [1765224940.9675] device (tap488a5725-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 08 20:15:40 compute-0 ovn_controller[96170]: 2025-12-08T20:15:40Z|00074|binding|INFO|Releasing lport 488a5725-c797-4165-b8ce-319c48f2e8b8 from this chassis (sb_readonly=0)
Dec 08 20:15:40 compute-0 ovn_controller[96170]: 2025-12-08T20:15:40Z|00075|binding|INFO|Setting lport 488a5725-c797-4165-b8ce-319c48f2e8b8 down in Southbound
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.972 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:40 compute-0 ovn_controller[96170]: 2025-12-08T20:15:40Z|00076|binding|INFO|Removing iface tap488a5725-c7 ovn-installed in OVS
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.975 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:40.982 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6c:cd:5c 10.100.0.4'], port_security=['fa:16:3e:6c:cd:5c 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '31b38d28-b90e-434c-9967-912987aee08b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ad1a4d6aebb84f6fb894551cd68d2ae1', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7d525409-d812-4dc1-bb50-a782007ffe4f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bcdae06f-5d1b-4090-b312-569e33431ebf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=488a5725-c797-4165-b8ce-319c48f2e8b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:15:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:40.984 105024 INFO neutron.agent.ovn.metadata.agent [-] Port 488a5725-c797-4165-b8ce-319c48f2e8b8 in datapath f378b9ae-fe6a-498a-b0ea-0d98aea69001 unbound from our chassis
Dec 08 20:15:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:40.987 105024 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f378b9ae-fe6a-498a-b0ea-0d98aea69001, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 08 20:15:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:40.988 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[4ef1211c-0649-4b37-a7d9-597c7e178c22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:40 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:40.989 105024 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 namespace which is not needed anymore
Dec 08 20:15:40 compute-0 nova_compute[187787]: 2025-12-08 20:15:40.990 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:41 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec 08 20:15:41 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000001.scope: Consumed 12.283s CPU time.
Dec 08 20:15:41 compute-0 systemd-machined[154122]: Machine qemu-5-instance-00000001 terminated.
Dec 08 20:15:41 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[215734]: [NOTICE]   (215738) : haproxy version is 2.8.14-c23fe91
Dec 08 20:15:41 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[215734]: [NOTICE]   (215738) : path to executable is /usr/sbin/haproxy
Dec 08 20:15:41 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[215734]: [WARNING]  (215738) : Exiting Master process...
Dec 08 20:15:41 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[215734]: [ALERT]    (215738) : Current worker (215740) exited with code 143 (Terminated)
Dec 08 20:15:41 compute-0 neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001[215734]: [WARNING]  (215738) : All workers exited. Exiting... (0)
Dec 08 20:15:41 compute-0 systemd[1]: libpod-aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e.scope: Deactivated successfully.
Dec 08 20:15:41 compute-0 podman[215959]: 2025-12-08 20:15:41.118787326 +0000 UTC m=+0.048135875 container died aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 08 20:15:41 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e-userdata-shm.mount: Deactivated successfully.
Dec 08 20:15:41 compute-0 systemd[1]: var-lib-containers-storage-overlay-be619dd8d0b92127e41ca678434bbb33e9edcfb45ae64de413dbc8a4916efea6-merged.mount: Deactivated successfully.
Dec 08 20:15:41 compute-0 podman[215959]: 2025-12-08 20:15:41.152916503 +0000 UTC m=+0.082265042 container cleanup aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 08 20:15:41 compute-0 systemd[1]: libpod-conmon-aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e.scope: Deactivated successfully.
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.171 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.177 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.209 187791 INFO nova.virt.libvirt.driver [-] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Instance destroyed successfully.
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.210 187791 DEBUG nova.objects.instance [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lazy-loading 'resources' on Instance uuid 31b38d28-b90e-434c-9967-912987aee08b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:15:41 compute-0 podman[215986]: 2025-12-08 20:15:41.218034988 +0000 UTC m=+0.040746695 container remove aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 08 20:15:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:41.222 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[7ba019a9-c7fd-4499-bff4-2b82291007f4]: (4, ('Mon Dec  8 08:15:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 (aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e)\naa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e\nMon Dec  8 08:15:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 (aa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e)\naa8b39b81832f36a26b453ead0b304f0063ec019b4e5766b1d3ba96a5225df4e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:41.224 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[753b43bb-04a0-4b8f-b690-98ff07c64566]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:41.225 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf378b9ae-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.226 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:41 compute-0 kernel: tapf378b9ae-f0: left promiscuous mode
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.244 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:41.248 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[499aeb19-8716-4256-b420-8b38e3d93f6b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:41.266 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[4b2b3755-60df-4b2b-8c4a-ecaa9a88b1ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:41.268 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0b003b-2e6e-420c-847a-c105c62f55ba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:41.284 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[e4bf16b2-d2e8-4f74-8a92-bf17dccff48f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345719, 'reachable_time': 30795, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 
'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216018, 'error': None, 'target': 'ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:41.288 105136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f378b9ae-fe6a-498a-b0ea-0d98aea69001 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 08 20:15:41 compute-0 systemd[1]: run-netns-ovnmeta\x2df378b9ae\x2dfe6a\x2d498a\x2db0ea\x2d0d98aea69001.mount: Deactivated successfully.
Dec 08 20:15:41 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:41.288 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[81fa8e7c-fc20-4d98-aade-43f55ebed480]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.370 187791 DEBUG nova.virt.libvirt.vif [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:14:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-662159316',display_name='tempest-ServerActionsTestJSON-server-662159316',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-662159316',id=1,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA0lmx240Myi6uyvHJAjl6OdYHzSJho9DIqF0f1bqWW8lbJ2EieN8cF8oR4Ivs97IM8rHwT/JRYR62Lhhu60wGctMY+Pf4FN5Y7bGT8qLOtA+UCE3QK9D+M+fl1vEmqInA==',key_name='tempest-keypair-940238471',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:14:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ad1a4d6aebb84f6fb894551cd68d2ae1',ramdisk_id='',reservation_id='r-02hyot02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1895647313',owner_user_name='tempest-ServerActionsTestJSON-1895647313-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:15:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='6e0c4248254a4bcb850e5443f0b8ad8b',uuid=31b38d28-b90e-434c-9967-912987aee08b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, 
"connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.371 187791 DEBUG nova.network.os_vif_util [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converting VIF {"id": "488a5725-c797-4165-b8ce-319c48f2e8b8", "address": "fa:16:3e:6c:cd:5c", "network": {"id": "f378b9ae-fe6a-498a-b0ea-0d98aea69001", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-2051911313-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ad1a4d6aebb84f6fb894551cd68d2ae1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap488a5725-c7", "ovs_interfaceid": "488a5725-c797-4165-b8ce-319c48f2e8b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.373 187791 DEBUG nova.network.os_vif_util [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.373 187791 DEBUG os_vif [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.375 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.376 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap488a5725-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.379 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.382 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.385 187791 INFO os_vif [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6c:cd:5c,bridge_name='br-int',has_traffic_filtering=True,id=488a5725-c797-4165-b8ce-319c48f2e8b8,network=Network(f378b9ae-fe6a-498a-b0ea-0d98aea69001),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap488a5725-c7')
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.386 187791 INFO nova.virt.libvirt.driver [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Deleting instance files /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b_del
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.387 187791 INFO nova.virt.libvirt.driver [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Deletion of /var/lib/nova/instances/31b38d28-b90e-434c-9967-912987aee08b_del complete
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.462 187791 INFO nova.compute.manager [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Took 0.51 seconds to destroy the instance on the hypervisor.
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.463 187791 DEBUG oslo.service.loopingcall [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.464 187791 DEBUG nova.compute.manager [-] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.464 187791 DEBUG nova.network.neutron [-] [instance: 31b38d28-b90e-434c-9967-912987aee08b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.528 187791 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765224926.5280762, 9f228c07-c6ac-479c-9edb-ceebc19eac87 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.529 187791 INFO nova.compute.manager [-] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] VM Stopped (Lifecycle Event)
Dec 08 20:15:41 compute-0 nova_compute[187787]: 2025-12-08 20:15:41.557 187791 DEBUG nova.compute.manager [None req-aac4ce36-4c5d-43e8-9fd2-dc93d68e741f - - - - - -] [instance: 9f228c07-c6ac-479c-9edb-ceebc19eac87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:15:42 compute-0 nova_compute[187787]: 2025-12-08 20:15:42.193 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.315 187791 DEBUG nova.network.neutron [-] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.356 187791 INFO nova.compute.manager [-] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Took 1.89 seconds to deallocate network for instance.
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.407 187791 DEBUG oslo_concurrency.lockutils [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.407 187791 DEBUG oslo_concurrency.lockutils [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.478 187791 DEBUG nova.compute.provider_tree [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.491 187791 DEBUG nova.compute.manager [req-8decb20a-5bcd-4b4f-bd7b-9949099024b9 req-28e087df-b857-42bc-9172-edc863ba8c9f 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Received event network-vif-deleted-488a5725-c797-4165-b8ce-319c48f2e8b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.498 187791 DEBUG nova.scheduler.client.report [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.541 187791 DEBUG oslo_concurrency.lockutils [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.588 187791 INFO nova.scheduler.client.report [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Deleted allocations for instance 31b38d28-b90e-434c-9967-912987aee08b
Dec 08 20:15:43 compute-0 nova_compute[187787]: 2025-12-08 20:15:43.723 187791 DEBUG oslo_concurrency.lockutils [None req-f7f12d2b-29cc-476d-95a6-b4fe62f6f81c 6e0c4248254a4bcb850e5443f0b8ad8b ad1a4d6aebb84f6fb894551cd68d2ae1 - - default default] Lock "31b38d28-b90e-434c-9967-912987aee08b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:45 compute-0 nova_compute[187787]: 2025-12-08 20:15:45.487 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:45 compute-0 nova_compute[187787]: 2025-12-08 20:15:45.712 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:45 compute-0 nova_compute[187787]: 2025-12-08 20:15:45.724 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:46 compute-0 nova_compute[187787]: 2025-12-08 20:15:46.378 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:46 compute-0 podman[216021]: 2025-12-08 20:15:46.530697876 +0000 UTC m=+0.086576666 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 08 20:15:46 compute-0 podman[216020]: 2025-12-08 20:15:46.552967202 +0000 UTC m=+0.113406934 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:15:49 compute-0 podman[216068]: 2025-12-08 20:15:49.482048004 +0000 UTC m=+0.053900495 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:15:50 compute-0 nova_compute[187787]: 2025-12-08 20:15:50.716 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:51 compute-0 nova_compute[187787]: 2025-12-08 20:15:51.381 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:52 compute-0 nova_compute[187787]: 2025-12-08 20:15:52.163 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:52.164 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ea:67:f9', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1e:d7:e5:ba:bd:f4'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:15:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:52.165 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 08 20:15:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:54.989 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:15:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:54.990 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:15:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:54.990 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:15:55 compute-0 nova_compute[187787]: 2025-12-08 20:15:55.719 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:56 compute-0 nova_compute[187787]: 2025-12-08 20:15:56.208 187791 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765224941.2077975, 31b38d28-b90e-434c-9967-912987aee08b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:15:56 compute-0 nova_compute[187787]: 2025-12-08 20:15:56.209 187791 INFO nova.compute.manager [-] [instance: 31b38d28-b90e-434c-9967-912987aee08b] VM Stopped (Lifecycle Event)
Dec 08 20:15:56 compute-0 nova_compute[187787]: 2025-12-08 20:15:56.239 187791 DEBUG nova.compute.manager [None req-b4cf8dbd-4b02-4305-a4b1-50a9af88a241 - - - - - -] [instance: 31b38d28-b90e-434c-9967-912987aee08b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:15:56 compute-0 nova_compute[187787]: 2025-12-08 20:15:56.382 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:15:57 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:15:57.168 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:15:59 compute-0 podman[216093]: 2025-12-08 20:15:59.521680227 +0000 UTC m=+0.088062485 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 08 20:15:59 compute-0 podman[202017]: time="2025-12-08T20:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:15:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:15:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3467 "" "Go-http-client/1.1"
Dec 08 20:16:00 compute-0 nova_compute[187787]: 2025-12-08 20:16:00.721 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:01 compute-0 nova_compute[187787]: 2025-12-08 20:16:01.384 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:01 compute-0 openstack_network_exporter[204149]: ERROR   20:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:16:01 compute-0 openstack_network_exporter[204149]: ERROR   20:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:16:01 compute-0 openstack_network_exporter[204149]: ERROR   20:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:16:01 compute-0 openstack_network_exporter[204149]: ERROR   20:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:16:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:16:01 compute-0 openstack_network_exporter[204149]: ERROR   20:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:16:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:16:05 compute-0 nova_compute[187787]: 2025-12-08 20:16:05.723 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.285 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.286 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.310 187791 DEBUG nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.386 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.414 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.415 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.422 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.422 187791 INFO nova.compute.claims [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Claim successful on node compute-0.ctlplane.example.com
Dec 08 20:16:06 compute-0 podman[216113]: 2025-12-08 20:16:06.483710551 +0000 UTC m=+0.054587421 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.818 187791 DEBUG nova.compute.provider_tree [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.869 187791 DEBUG nova.scheduler.client.report [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.917 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.918 187791 DEBUG nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.979 187791 DEBUG nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 08 20:16:06 compute-0 nova_compute[187787]: 2025-12-08 20:16:06.980 187791 DEBUG nova.network.neutron [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.018 187791 INFO nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.050 187791 DEBUG nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.156 187791 DEBUG nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.157 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.157 187791 INFO nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Creating image(s)
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.158 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "/var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.158 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "/var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.159 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "/var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.171 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.228 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.229 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.230 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.241 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.297 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.298 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.315 187791 DEBUG nova.policy [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9b86c7d5501a42d0bc8d49585ff3a697', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '67a776abf9054fd3b9fd5701a5c2a131', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.327 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk 1073741824" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.328 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.328 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.410 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.411 187791 DEBUG nova.virt.disk.api [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Checking if we can resize image /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.411 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.503 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.505 187791 DEBUG nova.virt.disk.api [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Cannot resize image /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.506 187791 DEBUG nova.objects.instance [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lazy-loading 'migration_context' on Instance uuid 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.533 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.533 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Ensure instance console log exists: /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.534 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.535 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:07 compute-0 nova_compute[187787]: 2025-12-08 20:16:07.536 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:08 compute-0 podman[216151]: 2025-12-08 20:16:08.479643617 +0000 UTC m=+0.047875248 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:16:08 compute-0 podman[216152]: 2025-12-08 20:16:08.493779892 +0000 UTC m=+0.056331865 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 08 20:16:08 compute-0 nova_compute[187787]: 2025-12-08 20:16:08.780 187791 DEBUG nova.network.neutron [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Successfully created port: fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 08 20:16:10 compute-0 nova_compute[187787]: 2025-12-08 20:16:10.554 187791 DEBUG nova.network.neutron [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Successfully updated port: fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 08 20:16:10 compute-0 nova_compute[187787]: 2025-12-08 20:16:10.571 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "refresh_cache-4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:16:10 compute-0 nova_compute[187787]: 2025-12-08 20:16:10.571 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquired lock "refresh_cache-4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:16:10 compute-0 nova_compute[187787]: 2025-12-08 20:16:10.572 187791 DEBUG nova.network.neutron [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:16:10 compute-0 nova_compute[187787]: 2025-12-08 20:16:10.672 187791 DEBUG nova.compute.manager [req-5f123375-f9d3-41c7-ba7a-3f98d9474d7a req-5e0ade6d-e062-4f02-b79e-76e304034758 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received event network-changed-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:10 compute-0 nova_compute[187787]: 2025-12-08 20:16:10.673 187791 DEBUG nova.compute.manager [req-5f123375-f9d3-41c7-ba7a-3f98d9474d7a req-5e0ade6d-e062-4f02-b79e-76e304034758 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Refreshing instance network info cache due to event network-changed-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:16:10 compute-0 nova_compute[187787]: 2025-12-08 20:16:10.673 187791 DEBUG oslo_concurrency.lockutils [req-5f123375-f9d3-41c7-ba7a-3f98d9474d7a req-5e0ade6d-e062-4f02-b79e-76e304034758 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:16:10 compute-0 nova_compute[187787]: 2025-12-08 20:16:10.725 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:10 compute-0 nova_compute[187787]: 2025-12-08 20:16:10.805 187791 DEBUG nova.network.neutron [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 08 20:16:11 compute-0 nova_compute[187787]: 2025-12-08 20:16:11.387 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:11 compute-0 sshd-session[216193]: Invalid user ubuntu from 103.172.28.62 port 37004
Dec 08 20:16:12 compute-0 sshd-session[216193]: Received disconnect from 103.172.28.62 port 37004:11: Bye Bye [preauth]
Dec 08 20:16:12 compute-0 sshd-session[216193]: Disconnected from invalid user ubuntu 103.172.28.62 port 37004 [preauth]
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.734 187791 DEBUG nova.network.neutron [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Updating instance_info_cache with network_info: [{"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.756 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Releasing lock "refresh_cache-4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.757 187791 DEBUG nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Instance network_info: |[{"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.757 187791 DEBUG oslo_concurrency.lockutils [req-5f123375-f9d3-41c7-ba7a-3f98d9474d7a req-5e0ade6d-e062-4f02-b79e-76e304034758 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.757 187791 DEBUG nova.network.neutron [req-5f123375-f9d3-41c7-ba7a-3f98d9474d7a req-5e0ade6d-e062-4f02-b79e-76e304034758 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Refreshing network info cache for port fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.761 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Start _get_guest_xml network_info=[{"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.766 187791 WARNING nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.773 187791 DEBUG nova.virt.libvirt.host [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.774 187791 DEBUG nova.virt.libvirt.host [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.778 187791 DEBUG nova.virt.libvirt.host [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.779 187791 DEBUG nova.virt.libvirt.host [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.779 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.779 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-08T20:13:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2f15909f-e95c-4c15-b311-ac90858a554d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.780 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.780 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.781 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.781 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.781 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.781 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.782 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.782 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.782 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.782 187791 DEBUG nova.virt.hardware [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.787 187791 DEBUG nova.virt.libvirt.vif [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-124047142',display_name='tempest-TestServerBasicOps-server-124047142',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-124047142',id=5,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNQSZtyugIOV6BIaMv5PCdQ3D3cXktHKxGmMONWzOHwlzVu6FkI1m7OO549cPk49Q2U1ZPrlDAE5eikWrXfTWD3qAFA2v5gaa3i0yzRmMrrvetYGsXNRGfH3TBzcoY+1/Q==',key_name='tempest-TestServerBasicOps-1490320690',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67a776abf9054fd3b9fd5701a5c2a131',ramdisk_id='',reservation_id='r-a72qcizu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-630825278',owner_user_name='tempest-TestServerBasicOps-630825278-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:16:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9b86c7d5501a42d0bc8d49585ff3a697',uuid=4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": 
"fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.787 187791 DEBUG nova.network.os_vif_util [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Converting VIF {"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.788 187791 DEBUG nova.network.os_vif_util [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:a1:23,bridge_name='br-int',has_traffic_filtering=True,id=fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad,network=Network(00649599-f843-461b-8984-ef8b1c5591f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4ee820-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.789 187791 DEBUG nova.objects.instance [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.806 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] End _get_guest_xml xml=<domain type="kvm">
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <uuid>4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6</uuid>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <name>instance-00000005</name>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <memory>131072</memory>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <vcpu>1</vcpu>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <metadata>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <nova:name>tempest-TestServerBasicOps-server-124047142</nova:name>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <nova:creationTime>2025-12-08 20:16:13</nova:creationTime>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <nova:flavor name="m1.nano">
Dec 08 20:16:13 compute-0 nova_compute[187787]:         <nova:memory>128</nova:memory>
Dec 08 20:16:13 compute-0 nova_compute[187787]:         <nova:disk>1</nova:disk>
Dec 08 20:16:13 compute-0 nova_compute[187787]:         <nova:swap>0</nova:swap>
Dec 08 20:16:13 compute-0 nova_compute[187787]:         <nova:ephemeral>0</nova:ephemeral>
Dec 08 20:16:13 compute-0 nova_compute[187787]:         <nova:vcpus>1</nova:vcpus>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       </nova:flavor>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <nova:owner>
Dec 08 20:16:13 compute-0 nova_compute[187787]:         <nova:user uuid="9b86c7d5501a42d0bc8d49585ff3a697">tempest-TestServerBasicOps-630825278-project-member</nova:user>
Dec 08 20:16:13 compute-0 nova_compute[187787]:         <nova:project uuid="67a776abf9054fd3b9fd5701a5c2a131">tempest-TestServerBasicOps-630825278</nova:project>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       </nova:owner>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <nova:root type="image" uuid="ffae60d8-1843-4b3a-9d11-b077095cedb9"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <nova:ports>
Dec 08 20:16:13 compute-0 nova_compute[187787]:         <nova:port uuid="fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad">
Dec 08 20:16:13 compute-0 nova_compute[187787]:           <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:         </nova:port>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       </nova:ports>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     </nova:instance>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   </metadata>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <sysinfo type="smbios">
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <system>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <entry name="manufacturer">RDO</entry>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <entry name="product">OpenStack Compute</entry>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <entry name="serial">4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6</entry>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <entry name="uuid">4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6</entry>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <entry name="family">Virtual Machine</entry>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     </system>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   </sysinfo>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <os>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <boot dev="hd"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <smbios mode="sysinfo"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   </os>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <features>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <acpi/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <apic/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <vmcoreinfo/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   </features>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <clock offset="utc">
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <timer name="pit" tickpolicy="delay"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <timer name="hpet" present="no"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   </clock>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <cpu mode="host-model" match="exact">
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <topology sockets="1" cores="1" threads="1"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <disk type="file" device="disk">
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <target dev="vda" bus="virtio"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <disk type="file" device="cdrom">
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <driver name="qemu" type="raw" cache="none"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk.config"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <target dev="sda" bus="sata"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <interface type="ethernet">
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <mac address="fa:16:3e:0e:a1:23"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <driver name="vhost" rx_queue_size="512"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <mtu size="1442"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <target dev="tapfe4ee820-4e"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <serial type="pty">
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <log file="/var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/console.log" append="off"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     </serial>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <video>
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     </video>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <input type="tablet" bus="usb"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <rng model="virtio">
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <backend model="random">/dev/urandom</backend>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <controller type="usb" index="0"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     <memballoon model="virtio">
Dec 08 20:16:13 compute-0 nova_compute[187787]:       <stats period="10"/>
Dec 08 20:16:13 compute-0 nova_compute[187787]:     </memballoon>
Dec 08 20:16:13 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:16:13 compute-0 nova_compute[187787]: </domain>
Dec 08 20:16:13 compute-0 nova_compute[187787]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.806 187791 DEBUG nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Preparing to wait for external event network-vif-plugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.807 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.807 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.807 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.808 187791 DEBUG nova.virt.libvirt.vif [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-124047142',display_name='tempest-TestServerBasicOps-server-124047142',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-124047142',id=5,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNQSZtyugIOV6BIaMv5PCdQ3D3cXktHKxGmMONWzOHwlzVu6FkI1m7OO549cPk49Q2U1ZPrlDAE5eikWrXfTWD3qAFA2v5gaa3i0yzRmMrrvetYGsXNRGfH3TBzcoY+1/Q==',key_name='tempest-TestServerBasicOps-1490320690',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='67a776abf9054fd3b9fd5701a5c2a131',ramdisk_id='',reservation_id='r-a72qcizu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-630825278',owner_user_name='tempest-TestServerBasicOps-630825278-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:16:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9b86c7d5501a42d0bc8d49585ff3a697',uuid=4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": 
"fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.808 187791 DEBUG nova.network.os_vif_util [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Converting VIF {"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.809 187791 DEBUG nova.network.os_vif_util [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:a1:23,bridge_name='br-int',has_traffic_filtering=True,id=fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad,network=Network(00649599-f843-461b-8984-ef8b1c5591f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4ee820-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.809 187791 DEBUG os_vif [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:a1:23,bridge_name='br-int',has_traffic_filtering=True,id=fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad,network=Network(00649599-f843-461b-8984-ef8b1c5591f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4ee820-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.809 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.810 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.810 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.812 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.812 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe4ee820-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.813 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe4ee820-4e, col_values=(('external_ids', {'iface-id': 'fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:a1:23', 'vm-uuid': '4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.815 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:13 compute-0 NetworkManager[56229]: <info>  [1765224973.8168] manager: (tapfe4ee820-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.817 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.822 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.823 187791 INFO os_vif [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:a1:23,bridge_name='br-int',has_traffic_filtering=True,id=fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad,network=Network(00649599-f843-461b-8984-ef8b1c5591f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4ee820-4e')
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.927 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.928 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.929 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] No VIF found with MAC fa:16:3e:0e:a1:23, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 08 20:16:13 compute-0 nova_compute[187787]: 2025-12-08 20:16:13.930 187791 INFO nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Using config drive
Dec 08 20:16:14 compute-0 nova_compute[187787]: 2025-12-08 20:16:14.883 187791 INFO nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Creating config drive at /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk.config
Dec 08 20:16:14 compute-0 nova_compute[187787]: 2025-12-08 20:16:14.889 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6qccspgl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.014 187791 DEBUG oslo_concurrency.processutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6qccspgl" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:15 compute-0 kernel: tapfe4ee820-4e: entered promiscuous mode
Dec 08 20:16:15 compute-0 NetworkManager[56229]: <info>  [1765224975.0859] manager: (tapfe4ee820-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.086 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:15 compute-0 ovn_controller[96170]: 2025-12-08T20:16:15Z|00077|binding|INFO|Claiming lport fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad for this chassis.
Dec 08 20:16:15 compute-0 ovn_controller[96170]: 2025-12-08T20:16:15Z|00078|binding|INFO|fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad: Claiming fa:16:3e:0e:a1:23 10.100.0.10
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.089 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:15 compute-0 systemd-udevd[216213]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:16:15 compute-0 NetworkManager[56229]: <info>  [1765224975.1284] device (tapfe4ee820-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 20:16:15 compute-0 NetworkManager[56229]: <info>  [1765224975.1289] device (tapfe4ee820-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 08 20:16:15 compute-0 systemd-machined[154122]: New machine qemu-6-instance-00000005.
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.141 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:a1:23 10.100.0.10'], port_security=['fa:16:3e:0e:a1:23 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00649599-f843-461b-8984-ef8b1c5591f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67a776abf9054fd3b9fd5701a5c2a131', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a3601c45-bd2b-45f5-b04a-8b3857fccd88 f8a9470e-b0c0-4d28-a3b4-b8e2afe7ce40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7820fe20-07cc-462b-bc6d-c654b1086d88, chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.142 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.144 105024 INFO neutron.agent.ovn.metadata.agent [-] Port fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad in datapath 00649599-f843-461b-8984-ef8b1c5591f5 bound to our chassis
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.146 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00649599-f843-461b-8984-ef8b1c5591f5
Dec 08 20:16:15 compute-0 ovn_controller[96170]: 2025-12-08T20:16:15Z|00079|binding|INFO|Setting lport fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad ovn-installed in OVS
Dec 08 20:16:15 compute-0 ovn_controller[96170]: 2025-12-08T20:16:15Z|00080|binding|INFO|Setting lport fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad up in Southbound
Dec 08 20:16:15 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000005.
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.150 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.159 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[0b025f81-88f5-4bfd-8bfe-e7f6052fa76f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.160 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00649599-f1 in ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.162 214668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00649599-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.163 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[8a584ef2-b162-41cb-9905-0e337ce0f3fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.164 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[7483ccb5-fc74-4934-aeac-9f787b60652c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.176 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[c9d6207f-68b7-4a36-931a-118d5ffdf7e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.199 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[96a3083f-0a84-47f4-bef8-8bcd895ff990]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.224 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[1462fbdc-1d9b-4863-b80b-e87491589c5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.229 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[6230d8cf-2446-4375-9a31-865c002e9b85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 NetworkManager[56229]: <info>  [1765224975.2307] manager: (tap00649599-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.254 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[11fcebda-9eee-48e5-8f7b-aea302389c9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.256 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[ac067128-9803-4723-a150-a662d31b4f1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 NetworkManager[56229]: <info>  [1765224975.2768] device (tap00649599-f0): carrier: link connected
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.281 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[86723635-79d2-4a03-a41d-ea0cf17e5acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.301 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[374a3cea-25bf-4c9a-a340-3903645ff462]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00649599-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:f1:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 351287, 'reachable_time': 40380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216249, 'error': None, 'target': 'ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.315 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[c899842a-794e-4a01-ba3f-a0af0da17464]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe99:f1e2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 351287, 'tstamp': 351287}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216250, 'error': None, 'target': 'ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.332 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[8c91727c-7441-4897-973c-415348144c66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00649599-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:99:f1:e2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 351287, 'reachable_time': 40380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216251, 'error': None, 'target': 'ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.360 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[b75fcc37-f710-4d7d-9557-bed6578b8f32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.416 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[2dc6924d-ebf6-446d-a7dd-718287770621]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.417 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00649599-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.418 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.418 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00649599-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.420 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:15 compute-0 kernel: tap00649599-f0: entered promiscuous mode
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.422 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.423 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00649599-f0, col_values=(('external_ids', {'iface-id': '1c8adead-c286-4358-bc66-014ea62fec5c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:15 compute-0 NetworkManager[56229]: <info>  [1765224975.4240] manager: (tap00649599-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec 08 20:16:15 compute-0 ovn_controller[96170]: 2025-12-08T20:16:15Z|00081|binding|INFO|Releasing lport 1c8adead-c286-4358-bc66-014ea62fec5c from this chassis (sb_readonly=0)
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.424 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.435 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.436 105024 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00649599-f843-461b-8984-ef8b1c5591f5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00649599-f843-461b-8984-ef8b1c5591f5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.437 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[13a5ff35-b6d4-4860-9d74-2d79b56d3caf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.438 105024 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: global
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     log         /dev/log local0 debug
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     log-tag     haproxy-metadata-proxy-00649599-f843-461b-8984-ef8b1c5591f5
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     user        root
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     group       root
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     maxconn     1024
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     pidfile     /var/lib/neutron/external/pids/00649599-f843-461b-8984-ef8b1c5591f5.pid.haproxy
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     daemon
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: defaults
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     log global
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     mode http
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     option httplog
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     option dontlognull
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     option http-server-close
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     option forwardfor
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     retries                 3
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     timeout http-request    30s
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     timeout connect         30s
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     timeout client          32s
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     timeout server          32s
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     timeout http-keep-alive 30s
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: listen listener
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     bind 169.254.169.254:80
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     server metadata /var/lib/neutron/metadata_proxy
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:     http-request add-header X-OVN-Network-ID 00649599-f843-461b-8984-ef8b1c5591f5
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 08 20:16:15 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:15.438 105024 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5', 'env', 'PROCESS_TAG=haproxy-00649599-f843-461b-8984-ef8b1c5591f5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00649599-f843-461b-8984-ef8b1c5591f5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 08 20:16:15 compute-0 nova_compute[187787]: 2025-12-08 20:16:15.727 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:15 compute-0 podman[216283]: 2025-12-08 20:16:15.792662652 +0000 UTC m=+0.049816829 container create 8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 08 20:16:15 compute-0 systemd[1]: Started libpod-conmon-8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2.scope.
Dec 08 20:16:15 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:16:15 compute-0 podman[216283]: 2025-12-08 20:16:15.766636312 +0000 UTC m=+0.023790509 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:16:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7bb8229d5f2629b09ed2a4be4b50afda9e1efcffe0761f51f2c2ced345198b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 08 20:16:15 compute-0 podman[216283]: 2025-12-08 20:16:15.87803249 +0000 UTC m=+0.135186687 container init 8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:16:15 compute-0 podman[216283]: 2025-12-08 20:16:15.885122073 +0000 UTC m=+0.142276250 container start 8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:16:15 compute-0 neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5[216298]: [NOTICE]   (216302) : New worker (216304) forked
Dec 08 20:16:15 compute-0 neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5[216298]: [NOTICE]   (216302) : Loading success.
Dec 08 20:16:16 compute-0 nova_compute[187787]: 2025-12-08 20:16:16.370 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224976.3693774, 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:16:16 compute-0 nova_compute[187787]: 2025-12-08 20:16:16.370 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] VM Started (Lifecycle Event)
Dec 08 20:16:16 compute-0 nova_compute[187787]: 2025-12-08 20:16:16.396 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:16:16 compute-0 nova_compute[187787]: 2025-12-08 20:16:16.400 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224976.3695269, 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:16:16 compute-0 nova_compute[187787]: 2025-12-08 20:16:16.401 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] VM Paused (Lifecycle Event)
Dec 08 20:16:16 compute-0 nova_compute[187787]: 2025-12-08 20:16:16.424 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:16:16 compute-0 nova_compute[187787]: 2025-12-08 20:16:16.428 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:16:16 compute-0 nova_compute[187787]: 2025-12-08 20:16:16.449 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:16:16 compute-0 sshd[129409]: Timeout before authentication for connection from 222.172.32.246 to 38.102.83.66, pid = 214360
Dec 08 20:16:17 compute-0 podman[216321]: 2025-12-08 20:16:17.010884291 +0000 UTC m=+0.059415632 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Dec 08 20:16:17 compute-0 podman[216320]: 2025-12-08 20:16:17.030501828 +0000 UTC m=+0.079140783 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 08 20:16:17 compute-0 nova_compute[187787]: 2025-12-08 20:16:17.901 187791 DEBUG nova.network.neutron [req-5f123375-f9d3-41c7-ba7a-3f98d9474d7a req-5e0ade6d-e062-4f02-b79e-76e304034758 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Updated VIF entry in instance network info cache for port fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:16:17 compute-0 nova_compute[187787]: 2025-12-08 20:16:17.901 187791 DEBUG nova.network.neutron [req-5f123375-f9d3-41c7-ba7a-3f98d9474d7a req-5e0ade6d-e062-4f02-b79e-76e304034758 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Updating instance_info_cache with network_info: [{"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:16:17 compute-0 nova_compute[187787]: 2025-12-08 20:16:17.921 187791 DEBUG oslo_concurrency.lockutils [req-5f123375-f9d3-41c7-ba7a-3f98d9474d7a req-5e0ade6d-e062-4f02-b79e-76e304034758 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:16:18 compute-0 nova_compute[187787]: 2025-12-08 20:16:18.816 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:19 compute-0 nova_compute[187787]: 2025-12-08 20:16:19.671 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:20 compute-0 podman[216365]: 2025-12-08 20:16:20.495889544 +0000 UTC m=+0.057937845 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:16:20 compute-0 nova_compute[187787]: 2025-12-08 20:16:20.728 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:20 compute-0 nova_compute[187787]: 2025-12-08 20:16:20.774 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:20 compute-0 nova_compute[187787]: 2025-12-08 20:16:20.801 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:20 compute-0 nova_compute[187787]: 2025-12-08 20:16:20.801 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.764 187791 DEBUG nova.compute.manager [req-192e7160-c3a3-4929-8a7d-6ab81e67b246 req-c9a1e9d8-a528-467d-a161-0dde4557e96c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received event network-vif-plugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.765 187791 DEBUG oslo_concurrency.lockutils [req-192e7160-c3a3-4929-8a7d-6ab81e67b246 req-c9a1e9d8-a528-467d-a161-0dde4557e96c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.765 187791 DEBUG oslo_concurrency.lockutils [req-192e7160-c3a3-4929-8a7d-6ab81e67b246 req-c9a1e9d8-a528-467d-a161-0dde4557e96c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.765 187791 DEBUG oslo_concurrency.lockutils [req-192e7160-c3a3-4929-8a7d-6ab81e67b246 req-c9a1e9d8-a528-467d-a161-0dde4557e96c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.766 187791 DEBUG nova.compute.manager [req-192e7160-c3a3-4929-8a7d-6ab81e67b246 req-c9a1e9d8-a528-467d-a161-0dde4557e96c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Processing event network-vif-plugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.766 187791 DEBUG nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.770 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224981.7706275, 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.771 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] VM Resumed (Lifecycle Event)
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.772 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.776 187791 INFO nova.virt.libvirt.driver [-] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Instance spawned successfully.
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.776 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.779 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.816 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.819 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.820 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.820 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.820 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.821 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.821 187791 DEBUG nova.virt.libvirt.driver [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.825 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.864 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.865 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.865 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.873 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.910 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.911 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.911 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.911 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.927 187791 INFO nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Took 14.77 seconds to spawn the instance on the hypervisor.
Dec 08 20:16:21 compute-0 nova_compute[187787]: 2025-12-08 20:16:21.928 187791 DEBUG nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.034 187791 INFO nova.compute.manager [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Took 15.66 seconds to build instance.
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.037 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.062 187791 DEBUG oslo_concurrency.lockutils [None req-e7f51248-089d-4cc4-9c1b-0ecdbd374828 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.776s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.100 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.101 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.153 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.310 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.311 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.328 187791 DEBUG nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.364 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.365 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5621MB free_disk=72.88029861450195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.365 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.366 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.414 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.464 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Instance 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.486 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Instance cb7797d1-3f0d-4927-8b0d-b6e999785b82 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.486 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.486 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.587 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.610 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.637 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.638 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.638 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.645 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.646 187791 INFO nova.compute.claims [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Claim successful on node compute-0.ctlplane.example.com
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.818 187791 DEBUG nova.compute.provider_tree [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.841 187791 DEBUG nova.scheduler.client.report [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.867 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.868 187791 DEBUG nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.917 187791 DEBUG nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.918 187791 DEBUG nova.network.neutron [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.941 187791 INFO nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 08 20:16:22 compute-0 nova_compute[187787]: 2025-12-08 20:16:22.964 187791 DEBUG nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.088 187791 DEBUG nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.090 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.090 187791 INFO nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Creating image(s)
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.091 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "/var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.092 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "/var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.093 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "/var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.118 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.206 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.208 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.208 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.220 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.261 187791 DEBUG nova.policy [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4dcbe1bf6044eaf802a21c96f112cd7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '64781c8586b847fb8240984ca5fbf6fa', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.296 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.298 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.348 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.351 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.354 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.437 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.439 187791 DEBUG nova.virt.disk.api [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Checking if we can resize image /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.440 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.504 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.505 187791 DEBUG nova.virt.disk.api [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Cannot resize image /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.506 187791 DEBUG nova.objects.instance [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lazy-loading 'migration_context' on Instance uuid cb7797d1-3f0d-4927-8b0d-b6e999785b82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.535 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.536 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Ensure instance console log exists: /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.537 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.538 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.539 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.553 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.554 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.554 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:23 compute-0 nova_compute[187787]: 2025-12-08 20:16:23.820 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:24 compute-0 nova_compute[187787]: 2025-12-08 20:16:24.003 187791 DEBUG nova.compute.manager [req-834631a3-ff4b-4d97-a149-5390c99bc87e req-67b4bbd0-69b5-474f-8ad6-a4b6a69cd228 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received event network-vif-plugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:24 compute-0 nova_compute[187787]: 2025-12-08 20:16:24.004 187791 DEBUG oslo_concurrency.lockutils [req-834631a3-ff4b-4d97-a149-5390c99bc87e req-67b4bbd0-69b5-474f-8ad6-a4b6a69cd228 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:24 compute-0 nova_compute[187787]: 2025-12-08 20:16:24.004 187791 DEBUG oslo_concurrency.lockutils [req-834631a3-ff4b-4d97-a149-5390c99bc87e req-67b4bbd0-69b5-474f-8ad6-a4b6a69cd228 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:24 compute-0 nova_compute[187787]: 2025-12-08 20:16:24.004 187791 DEBUG oslo_concurrency.lockutils [req-834631a3-ff4b-4d97-a149-5390c99bc87e req-67b4bbd0-69b5-474f-8ad6-a4b6a69cd228 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:24 compute-0 nova_compute[187787]: 2025-12-08 20:16:24.004 187791 DEBUG nova.compute.manager [req-834631a3-ff4b-4d97-a149-5390c99bc87e req-67b4bbd0-69b5-474f-8ad6-a4b6a69cd228 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] No waiting events found dispatching network-vif-plugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:16:24 compute-0 nova_compute[187787]: 2025-12-08 20:16:24.004 187791 WARNING nova.compute.manager [req-834631a3-ff4b-4d97-a149-5390c99bc87e req-67b4bbd0-69b5-474f-8ad6-a4b6a69cd228 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received unexpected event network-vif-plugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad for instance with vm_state active and task_state None.
Dec 08 20:16:24 compute-0 nova_compute[187787]: 2025-12-08 20:16:24.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:16:24 compute-0 nova_compute[187787]: 2025-12-08 20:16:24.873 187791 DEBUG nova.network.neutron [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Successfully created port: 820863a3-a6c4-4b66-962b-a1dd233d0420 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 08 20:16:25 compute-0 nova_compute[187787]: 2025-12-08 20:16:25.731 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:27 compute-0 nova_compute[187787]: 2025-12-08 20:16:27.591 187791 DEBUG nova.network.neutron [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Successfully updated port: 820863a3-a6c4-4b66-962b-a1dd233d0420 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 08 20:16:27 compute-0 nova_compute[187787]: 2025-12-08 20:16:27.638 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "refresh_cache-cb7797d1-3f0d-4927-8b0d-b6e999785b82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:16:27 compute-0 nova_compute[187787]: 2025-12-08 20:16:27.639 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquired lock "refresh_cache-cb7797d1-3f0d-4927-8b0d-b6e999785b82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:16:27 compute-0 nova_compute[187787]: 2025-12-08 20:16:27.639 187791 DEBUG nova.network.neutron [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:16:28 compute-0 nova_compute[187787]: 2025-12-08 20:16:28.288 187791 DEBUG nova.compute.manager [req-811bf8a8-0889-4406-be6a-f5e0af2fa5b3 req-b5fc58d1-8ec0-465e-ab4f-050bafe877a3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Received event network-changed-820863a3-a6c4-4b66-962b-a1dd233d0420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:28 compute-0 nova_compute[187787]: 2025-12-08 20:16:28.289 187791 DEBUG nova.compute.manager [req-811bf8a8-0889-4406-be6a-f5e0af2fa5b3 req-b5fc58d1-8ec0-465e-ab4f-050bafe877a3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Refreshing instance network info cache due to event network-changed-820863a3-a6c4-4b66-962b-a1dd233d0420. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:16:28 compute-0 nova_compute[187787]: 2025-12-08 20:16:28.289 187791 DEBUG oslo_concurrency.lockutils [req-811bf8a8-0889-4406-be6a-f5e0af2fa5b3 req-b5fc58d1-8ec0-465e-ab4f-050bafe877a3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-cb7797d1-3f0d-4927-8b0d-b6e999785b82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:16:28 compute-0 nova_compute[187787]: 2025-12-08 20:16:28.340 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:28 compute-0 NetworkManager[56229]: <info>  [1765224988.3611] manager: (patch-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Dec 08 20:16:28 compute-0 NetworkManager[56229]: <info>  [1765224988.3623] manager: (patch-br-int-to-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Dec 08 20:16:28 compute-0 ovn_controller[96170]: 2025-12-08T20:16:28Z|00082|binding|INFO|Releasing lport 1c8adead-c286-4358-bc66-014ea62fec5c from this chassis (sb_readonly=0)
Dec 08 20:16:28 compute-0 ovn_controller[96170]: 2025-12-08T20:16:28Z|00083|binding|INFO|Releasing lport 1c8adead-c286-4358-bc66-014ea62fec5c from this chassis (sb_readonly=0)
Dec 08 20:16:28 compute-0 nova_compute[187787]: 2025-12-08 20:16:28.553 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:28 compute-0 nova_compute[187787]: 2025-12-08 20:16:28.822 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:29 compute-0 nova_compute[187787]: 2025-12-08 20:16:29.049 187791 DEBUG nova.network.neutron [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 08 20:16:29 compute-0 podman[202017]: time="2025-12-08T20:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:16:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 23776 "" "Go-http-client/1.1"
Dec 08 20:16:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3919 "" "Go-http-client/1.1"
Dec 08 20:16:30 compute-0 nova_compute[187787]: 2025-12-08 20:16:30.420 187791 DEBUG nova.compute.manager [req-c2c7d1ca-376c-47c3-8809-7df223a8ea3d req-9e0f65ec-dfa1-4236-aea5-f8857d2b0fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received event network-changed-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:30 compute-0 nova_compute[187787]: 2025-12-08 20:16:30.421 187791 DEBUG nova.compute.manager [req-c2c7d1ca-376c-47c3-8809-7df223a8ea3d req-9e0f65ec-dfa1-4236-aea5-f8857d2b0fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Refreshing instance network info cache due to event network-changed-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:16:30 compute-0 nova_compute[187787]: 2025-12-08 20:16:30.422 187791 DEBUG oslo_concurrency.lockutils [req-c2c7d1ca-376c-47c3-8809-7df223a8ea3d req-9e0f65ec-dfa1-4236-aea5-f8857d2b0fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:16:30 compute-0 nova_compute[187787]: 2025-12-08 20:16:30.422 187791 DEBUG oslo_concurrency.lockutils [req-c2c7d1ca-376c-47c3-8809-7df223a8ea3d req-9e0f65ec-dfa1-4236-aea5-f8857d2b0fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:16:30 compute-0 nova_compute[187787]: 2025-12-08 20:16:30.423 187791 DEBUG nova.network.neutron [req-c2c7d1ca-376c-47c3-8809-7df223a8ea3d req-9e0f65ec-dfa1-4236-aea5-f8857d2b0fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Refreshing network info cache for port fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:16:30 compute-0 podman[216416]: 2025-12-08 20:16:30.494880972 +0000 UTC m=+0.062882700 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Dec 08 20:16:30 compute-0 nova_compute[187787]: 2025-12-08 20:16:30.736 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.215 187791 DEBUG nova.network.neutron [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Updating instance_info_cache with network_info: [{"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.243 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Releasing lock "refresh_cache-cb7797d1-3f0d-4927-8b0d-b6e999785b82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.243 187791 DEBUG nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Instance network_info: |[{"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.244 187791 DEBUG oslo_concurrency.lockutils [req-811bf8a8-0889-4406-be6a-f5e0af2fa5b3 req-b5fc58d1-8ec0-465e-ab4f-050bafe877a3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-cb7797d1-3f0d-4927-8b0d-b6e999785b82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.244 187791 DEBUG nova.network.neutron [req-811bf8a8-0889-4406-be6a-f5e0af2fa5b3 req-b5fc58d1-8ec0-465e-ab4f-050bafe877a3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Refreshing network info cache for port 820863a3-a6c4-4b66-962b-a1dd233d0420 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.247 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Start _get_guest_xml network_info=[{"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.253 187791 WARNING nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.262 187791 DEBUG nova.virt.libvirt.host [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.264 187791 DEBUG nova.virt.libvirt.host [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.276 187791 DEBUG nova.virt.libvirt.host [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.277 187791 DEBUG nova.virt.libvirt.host [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.278 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.278 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-08T20:13:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2f15909f-e95c-4c15-b311-ac90858a554d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.279 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.279 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.279 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.279 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.280 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.280 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.281 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.281 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.281 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.281 187791 DEBUG nova.virt.hardware [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.285 187791 DEBUG nova.virt.libvirt.vif [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:16:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-547578929',display_name='tempest-ServerAddressesTestJSON-server-547578929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-547578929',id=6,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64781c8586b847fb8240984ca5fbf6fa',ramdisk_id='',reservation_id='r-6x7gej7s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1485290880',owner_user_name='tempest-ServerAddressesTestJSON-1485290880-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:16:23Z,user_data=None,user_id='a4dcbe1bf6044eaf802a21c96f112cd7',uuid=cb7797d1-3f0d-4927-8b0d-b6e999785b82,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.286 187791 DEBUG nova.network.os_vif_util [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Converting VIF {"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.286 187791 DEBUG nova.network.os_vif_util [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:99:45,bridge_name='br-int',has_traffic_filtering=True,id=820863a3-a6c4-4b66-962b-a1dd233d0420,network=Network(f6265520-f130-4200-ac6a-92b78769cf12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap820863a3-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.287 187791 DEBUG nova.objects.instance [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lazy-loading 'pci_devices' on Instance uuid cb7797d1-3f0d-4927-8b0d-b6e999785b82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.305 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] End _get_guest_xml xml=<domain type="kvm">
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <uuid>cb7797d1-3f0d-4927-8b0d-b6e999785b82</uuid>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <name>instance-00000006</name>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <memory>131072</memory>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <vcpu>1</vcpu>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <metadata>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <nova:name>tempest-ServerAddressesTestJSON-server-547578929</nova:name>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <nova:creationTime>2025-12-08 20:16:31</nova:creationTime>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <nova:flavor name="m1.nano">
Dec 08 20:16:31 compute-0 nova_compute[187787]:         <nova:memory>128</nova:memory>
Dec 08 20:16:31 compute-0 nova_compute[187787]:         <nova:disk>1</nova:disk>
Dec 08 20:16:31 compute-0 nova_compute[187787]:         <nova:swap>0</nova:swap>
Dec 08 20:16:31 compute-0 nova_compute[187787]:         <nova:ephemeral>0</nova:ephemeral>
Dec 08 20:16:31 compute-0 nova_compute[187787]:         <nova:vcpus>1</nova:vcpus>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       </nova:flavor>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <nova:owner>
Dec 08 20:16:31 compute-0 nova_compute[187787]:         <nova:user uuid="a4dcbe1bf6044eaf802a21c96f112cd7">tempest-ServerAddressesTestJSON-1485290880-project-member</nova:user>
Dec 08 20:16:31 compute-0 nova_compute[187787]:         <nova:project uuid="64781c8586b847fb8240984ca5fbf6fa">tempest-ServerAddressesTestJSON-1485290880</nova:project>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       </nova:owner>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <nova:root type="image" uuid="ffae60d8-1843-4b3a-9d11-b077095cedb9"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <nova:ports>
Dec 08 20:16:31 compute-0 nova_compute[187787]:         <nova:port uuid="820863a3-a6c4-4b66-962b-a1dd233d0420">
Dec 08 20:16:31 compute-0 nova_compute[187787]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:         </nova:port>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       </nova:ports>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     </nova:instance>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   </metadata>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <sysinfo type="smbios">
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <system>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <entry name="manufacturer">RDO</entry>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <entry name="product">OpenStack Compute</entry>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <entry name="serial">cb7797d1-3f0d-4927-8b0d-b6e999785b82</entry>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <entry name="uuid">cb7797d1-3f0d-4927-8b0d-b6e999785b82</entry>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <entry name="family">Virtual Machine</entry>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     </system>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   </sysinfo>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <os>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <boot dev="hd"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <smbios mode="sysinfo"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   </os>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <features>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <acpi/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <apic/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <vmcoreinfo/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   </features>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <clock offset="utc">
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <timer name="pit" tickpolicy="delay"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <timer name="hpet" present="no"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   </clock>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <cpu mode="host-model" match="exact">
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <topology sockets="1" cores="1" threads="1"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <disk type="file" device="disk">
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <target dev="vda" bus="virtio"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <disk type="file" device="cdrom">
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <driver name="qemu" type="raw" cache="none"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk.config"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <target dev="sda" bus="sata"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <interface type="ethernet">
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <mac address="fa:16:3e:17:99:45"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <driver name="vhost" rx_queue_size="512"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <mtu size="1442"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <target dev="tap820863a3-a6"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <serial type="pty">
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <log file="/var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/console.log" append="off"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     </serial>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <video>
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     </video>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <input type="tablet" bus="usb"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <rng model="virtio">
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <backend model="random">/dev/urandom</backend>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <controller type="usb" index="0"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     <memballoon model="virtio">
Dec 08 20:16:31 compute-0 nova_compute[187787]:       <stats period="10"/>
Dec 08 20:16:31 compute-0 nova_compute[187787]:     </memballoon>
Dec 08 20:16:31 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:16:31 compute-0 nova_compute[187787]: </domain>
Dec 08 20:16:31 compute-0 nova_compute[187787]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.311 187791 DEBUG nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Preparing to wait for external event network-vif-plugged-820863a3-a6c4-4b66-962b-a1dd233d0420 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.311 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.312 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.312 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.313 187791 DEBUG nova.virt.libvirt.vif [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:16:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-547578929',display_name='tempest-ServerAddressesTestJSON-server-547578929',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-547578929',id=6,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64781c8586b847fb8240984ca5fbf6fa',ramdisk_id='',reservation_id='r-6x7gej7s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-1485290880',owner_user_name='tempest-ServerAddressesTestJSON-1485290880-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:16:23Z,user_data=None,user_id='a4dcbe1bf6044eaf802a21c96f112cd7',uuid=cb7797d1-3f0d-4927-8b0d-b6e999785b82,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.314 187791 DEBUG nova.network.os_vif_util [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Converting VIF {"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.314 187791 DEBUG nova.network.os_vif_util [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:99:45,bridge_name='br-int',has_traffic_filtering=True,id=820863a3-a6c4-4b66-962b-a1dd233d0420,network=Network(f6265520-f130-4200-ac6a-92b78769cf12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap820863a3-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.315 187791 DEBUG os_vif [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:99:45,bridge_name='br-int',has_traffic_filtering=True,id=820863a3-a6c4-4b66-962b-a1dd233d0420,network=Network(f6265520-f130-4200-ac6a-92b78769cf12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap820863a3-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.316 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.316 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.316 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.321 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.321 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap820863a3-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.322 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap820863a3-a6, col_values=(('external_ids', {'iface-id': '820863a3-a6c4-4b66-962b-a1dd233d0420', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:99:45', 'vm-uuid': 'cb7797d1-3f0d-4927-8b0d-b6e999785b82'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.324 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:31 compute-0 NetworkManager[56229]: <info>  [1765224991.3257] manager: (tap820863a3-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.328 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.338 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.340 187791 INFO os_vif [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:99:45,bridge_name='br-int',has_traffic_filtering=True,id=820863a3-a6c4-4b66-962b-a1dd233d0420,network=Network(f6265520-f130-4200-ac6a-92b78769cf12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap820863a3-a6')
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.401 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.402 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.402 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] No VIF found with MAC fa:16:3e:17:99:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.403 187791 INFO nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Using config drive
Dec 08 20:16:31 compute-0 openstack_network_exporter[204149]: ERROR   20:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:16:31 compute-0 openstack_network_exporter[204149]: ERROR   20:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:16:31 compute-0 openstack_network_exporter[204149]: ERROR   20:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:16:31 compute-0 openstack_network_exporter[204149]: ERROR   20:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:16:31 compute-0 openstack_network_exporter[204149]: ERROR   20:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.951 187791 INFO nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Creating config drive at /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk.config
Dec 08 20:16:31 compute-0 nova_compute[187787]: 2025-12-08 20:16:31.958 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0yly0w6_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.087 187791 DEBUG oslo_concurrency.processutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0yly0w6_" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:16:32 compute-0 kernel: tap820863a3-a6: entered promiscuous mode
Dec 08 20:16:32 compute-0 NetworkManager[56229]: <info>  [1765224992.1651] manager: (tap820863a3-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Dec 08 20:16:32 compute-0 ovn_controller[96170]: 2025-12-08T20:16:32Z|00084|binding|INFO|Claiming lport 820863a3-a6c4-4b66-962b-a1dd233d0420 for this chassis.
Dec 08 20:16:32 compute-0 ovn_controller[96170]: 2025-12-08T20:16:32Z|00085|binding|INFO|820863a3-a6c4-4b66-962b-a1dd233d0420: Claiming fa:16:3e:17:99:45 10.100.0.11
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.171 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.191 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:99:45 10.100.0.11'], port_security=['fa:16:3e:17:99:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cb7797d1-3f0d-4927-8b0d-b6e999785b82', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6265520-f130-4200-ac6a-92b78769cf12', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64781c8586b847fb8240984ca5fbf6fa', 'neutron:revision_number': '2', 'neutron:security_group_ids': '64456497-54bc-4a8a-b34b-88ab884bdf50', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b59c7489-ca7b-4a07-878a-369e272a2b1c, chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=820863a3-a6c4-4b66-962b-a1dd233d0420) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:16:32 compute-0 ovn_controller[96170]: 2025-12-08T20:16:32Z|00086|binding|INFO|Setting lport 820863a3-a6c4-4b66-962b-a1dd233d0420 ovn-installed in OVS
Dec 08 20:16:32 compute-0 ovn_controller[96170]: 2025-12-08T20:16:32Z|00087|binding|INFO|Setting lport 820863a3-a6c4-4b66-962b-a1dd233d0420 up in Southbound
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.196 105024 INFO neutron.agent.ovn.metadata.agent [-] Port 820863a3-a6c4-4b66-962b-a1dd233d0420 in datapath f6265520-f130-4200-ac6a-92b78769cf12 bound to our chassis
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.200 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f6265520-f130-4200-ac6a-92b78769cf12
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.212 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0677bf-c192-4460-a016-a36a50d8140b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.213 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf6265520-f1 in ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.218 214668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf6265520-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.218 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[66cd5070-fdd8-4133-ab07-8b07db01a0c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.220 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[ad3abdb1-9d84-4e1b-8ea1-097f3db747a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 systemd-machined[154122]: New machine qemu-7-instance-00000006.
Dec 08 20:16:32 compute-0 systemd-udevd[216460]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:16:32 compute-0 NetworkManager[56229]: <info>  [1765224992.2439] device (tap820863a3-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 20:16:32 compute-0 NetworkManager[56229]: <info>  [1765224992.2452] device (tap820863a3-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 08 20:16:32 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-00000006.
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.250 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[220d3224-221c-489d-a7b1-0169b89889b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.272 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f63795e6-1cb6-4fa1-a78f-99184b55f457]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.306 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[2e60acc0-efb7-40ab-980f-1654a0a1d2ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 NetworkManager[56229]: <info>  [1765224992.3129] manager: (tapf6265520-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/51)
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.311 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[74038b3b-9016-4e4d-ad73-7464b5d09b0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.346 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[2533577c-53ee-4cac-95b6-f22ed2edf106]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.350 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[4188b2ad-69c5-4b27-8806-fd1d727c5909]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 NetworkManager[56229]: <info>  [1765224992.3778] device (tapf6265520-f0): carrier: link connected
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.384 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[208d61e2-58eb-40cb-8775-94e450d17bd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.403 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[c16ea361-6f6d-4e9b-8de1-e49ec2f20cb8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6265520-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:3b:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352997, 'reachable_time': 34929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216492, 'error': None, 'target': 'ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.422 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3d3af0-d658-43cd-a74a-150db94ace18]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef8:3b98'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 352997, 'tstamp': 352997}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216493, 'error': None, 'target': 'ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.441 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[09956e10-d0c2-4060-8f99-3e85735fd8a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf6265520-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:3b:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352997, 'reachable_time': 34929, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216494, 'error': None, 'target': 'ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.483 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[4837c5a5-cdc9-469c-a2b1-1492e55a2cba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.556 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[731dc19d-add5-44a5-9786-0d06c692e34b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.558 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6265520-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.559 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.560 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf6265520-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.562 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:32 compute-0 NetworkManager[56229]: <info>  [1765224992.5633] manager: (tapf6265520-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/52)
Dec 08 20:16:32 compute-0 kernel: tapf6265520-f0: entered promiscuous mode
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.565 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf6265520-f0, col_values=(('external_ids', {'iface-id': '18fa5a70-0bbf-4a90-b824-a803b1e1bef1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.566 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:32 compute-0 ovn_controller[96170]: 2025-12-08T20:16:32Z|00088|binding|INFO|Releasing lport 18fa5a70-0bbf-4a90-b824-a803b1e1bef1 from this chassis (sb_readonly=0)
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.578 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.578 105024 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f6265520-f130-4200-ac6a-92b78769cf12.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f6265520-f130-4200-ac6a-92b78769cf12.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.579 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.579 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[cad31d84-4f35-4190-a59c-0d42beef0ae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.580 105024 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: global
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     log         /dev/log local0 debug
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     log-tag     haproxy-metadata-proxy-f6265520-f130-4200-ac6a-92b78769cf12
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     user        root
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     group       root
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     maxconn     1024
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     pidfile     /var/lib/neutron/external/pids/f6265520-f130-4200-ac6a-92b78769cf12.pid.haproxy
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     daemon
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: defaults
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     log global
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     mode http
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     option httplog
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     option dontlognull
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     option http-server-close
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     option forwardfor
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     retries                 3
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     timeout http-request    30s
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     timeout connect         30s
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     timeout client          32s
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     timeout server          32s
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     timeout http-keep-alive 30s
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: listen listener
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     bind 169.254.169.254:80
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     server metadata /var/lib/neutron/metadata_proxy
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:     http-request add-header X-OVN-Network-ID f6265520-f130-4200-ac6a-92b78769cf12
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 08 20:16:32 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:32.581 105024 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12', 'env', 'PROCESS_TAG=haproxy-f6265520-f130-4200-ac6a-92b78769cf12', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f6265520-f130-4200-ac6a-92b78769cf12.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.720 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224992.7202299, cb7797d1-3f0d-4927-8b0d-b6e999785b82 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.721 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] VM Started (Lifecycle Event)
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.742 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.746 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224992.7230563, cb7797d1-3f0d-4927-8b0d-b6e999785b82 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.747 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] VM Paused (Lifecycle Event)
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.767 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.771 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.807 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.971 187791 DEBUG nova.network.neutron [req-c2c7d1ca-376c-47c3-8809-7df223a8ea3d req-9e0f65ec-dfa1-4236-aea5-f8857d2b0fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Updated VIF entry in instance network info cache for port fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.971 187791 DEBUG nova.network.neutron [req-c2c7d1ca-376c-47c3-8809-7df223a8ea3d req-9e0f65ec-dfa1-4236-aea5-f8857d2b0fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Updating instance_info_cache with network_info: [{"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.989 187791 DEBUG nova.network.neutron [req-811bf8a8-0889-4406-be6a-f5e0af2fa5b3 req-b5fc58d1-8ec0-465e-ab4f-050bafe877a3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Updated VIF entry in instance network info cache for port 820863a3-a6c4-4b66-962b-a1dd233d0420. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:16:32 compute-0 nova_compute[187787]: 2025-12-08 20:16:32.990 187791 DEBUG nova.network.neutron [req-811bf8a8-0889-4406-be6a-f5e0af2fa5b3 req-b5fc58d1-8ec0-465e-ab4f-050bafe877a3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Updating instance_info_cache with network_info: [{"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.018 187791 DEBUG oslo_concurrency.lockutils [req-811bf8a8-0889-4406-be6a-f5e0af2fa5b3 req-b5fc58d1-8ec0-465e-ab4f-050bafe877a3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-cb7797d1-3f0d-4927-8b0d-b6e999785b82" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.020 187791 DEBUG oslo_concurrency.lockutils [req-c2c7d1ca-376c-47c3-8809-7df223a8ea3d req-9e0f65ec-dfa1-4236-aea5-f8857d2b0fc5 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:16:33 compute-0 podman[216540]: 2025-12-08 20:16:33.020984232 +0000 UTC m=+0.057711968 container create bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 08 20:16:33 compute-0 systemd[1]: Started libpod-conmon-bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909.scope.
Dec 08 20:16:33 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:16:33 compute-0 podman[216540]: 2025-12-08 20:16:32.988898751 +0000 UTC m=+0.025626507 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:16:33 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e076e20911ae3c3cc93dbe2afaa9bb38d2e65e535f0c3704f4b94725b5fa9d3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 08 20:16:33 compute-0 podman[216540]: 2025-12-08 20:16:33.106409431 +0000 UTC m=+0.143137167 container init bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:16:33 compute-0 podman[216540]: 2025-12-08 20:16:33.112218704 +0000 UTC m=+0.148946440 container start bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 08 20:16:33 compute-0 neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12[216561]: [NOTICE]   (216567) : New worker (216569) forked
Dec 08 20:16:33 compute-0 neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12[216561]: [NOTICE]   (216567) : Loading success.
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.247 187791 DEBUG nova.compute.manager [req-ce1b3c1b-f49c-4d61-b869-ae6de1e27103 req-5c7de652-bee2-4455-8b63-f2f9c097f22b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Received event network-vif-plugged-820863a3-a6c4-4b66-962b-a1dd233d0420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.248 187791 DEBUG oslo_concurrency.lockutils [req-ce1b3c1b-f49c-4d61-b869-ae6de1e27103 req-5c7de652-bee2-4455-8b63-f2f9c097f22b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.248 187791 DEBUG oslo_concurrency.lockutils [req-ce1b3c1b-f49c-4d61-b869-ae6de1e27103 req-5c7de652-bee2-4455-8b63-f2f9c097f22b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.249 187791 DEBUG oslo_concurrency.lockutils [req-ce1b3c1b-f49c-4d61-b869-ae6de1e27103 req-5c7de652-bee2-4455-8b63-f2f9c097f22b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.249 187791 DEBUG nova.compute.manager [req-ce1b3c1b-f49c-4d61-b869-ae6de1e27103 req-5c7de652-bee2-4455-8b63-f2f9c097f22b 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Processing event network-vif-plugged-820863a3-a6c4-4b66-962b-a1dd233d0420 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.250 187791 DEBUG nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.257 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765224993.2560143, cb7797d1-3f0d-4927-8b0d-b6e999785b82 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.258 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] VM Resumed (Lifecycle Event)
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.260 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.265 187791 INFO nova.virt.libvirt.driver [-] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Instance spawned successfully.
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.265 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.288 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.296 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.300 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.301 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.302 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.302 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.303 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.304 187791 DEBUG nova.virt.libvirt.driver [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.333 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.361 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.377 187791 INFO nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Took 10.29 seconds to spawn the instance on the hypervisor.
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.378 187791 DEBUG nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:16:33 compute-0 sshd-session[216436]: Received disconnect from 45.78.217.210 port 42610:11: Bye Bye [preauth]
Dec 08 20:16:33 compute-0 sshd-session[216436]: Disconnected from authenticating user root 45.78.217.210 port 42610 [preauth]
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.440 187791 INFO nova.compute.manager [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Took 11.06 seconds to build instance.
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.458 187791 DEBUG oslo_concurrency.lockutils [None req-fc471362-46e8-4702-9872-15b89a9fa9a9 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:33 compute-0 ovn_controller[96170]: 2025-12-08T20:16:33Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0e:a1:23 10.100.0.10
Dec 08 20:16:33 compute-0 ovn_controller[96170]: 2025-12-08T20:16:33Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:a1:23 10.100.0.10
Dec 08 20:16:33 compute-0 ovn_controller[96170]: 2025-12-08T20:16:33Z|00089|binding|INFO|Releasing lport 18fa5a70-0bbf-4a90-b824-a803b1e1bef1 from this chassis (sb_readonly=0)
Dec 08 20:16:33 compute-0 ovn_controller[96170]: 2025-12-08T20:16:33Z|00090|binding|INFO|Releasing lport 1c8adead-c286-4358-bc66-014ea62fec5c from this chassis (sb_readonly=0)
Dec 08 20:16:33 compute-0 nova_compute[187787]: 2025-12-08 20:16:33.935 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.402 187791 DEBUG nova.compute.manager [req-3e1e1832-c753-4304-b3d6-3b4596cf7940 req-d471ab02-295b-4d85-9f6a-5f85bc6b688d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Received event network-vif-plugged-820863a3-a6c4-4b66-962b-a1dd233d0420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.403 187791 DEBUG oslo_concurrency.lockutils [req-3e1e1832-c753-4304-b3d6-3b4596cf7940 req-d471ab02-295b-4d85-9f6a-5f85bc6b688d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.404 187791 DEBUG oslo_concurrency.lockutils [req-3e1e1832-c753-4304-b3d6-3b4596cf7940 req-d471ab02-295b-4d85-9f6a-5f85bc6b688d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.404 187791 DEBUG oslo_concurrency.lockutils [req-3e1e1832-c753-4304-b3d6-3b4596cf7940 req-d471ab02-295b-4d85-9f6a-5f85bc6b688d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.405 187791 DEBUG nova.compute.manager [req-3e1e1832-c753-4304-b3d6-3b4596cf7940 req-d471ab02-295b-4d85-9f6a-5f85bc6b688d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] No waiting events found dispatching network-vif-plugged-820863a3-a6c4-4b66-962b-a1dd233d0420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.405 187791 WARNING nova.compute.manager [req-3e1e1832-c753-4304-b3d6-3b4596cf7940 req-d471ab02-295b-4d85-9f6a-5f85bc6b688d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Received unexpected event network-vif-plugged-820863a3-a6c4-4b66-962b-a1dd233d0420 for instance with vm_state active and task_state None.
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.739 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.887 187791 DEBUG oslo_concurrency.lockutils [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.888 187791 DEBUG oslo_concurrency.lockutils [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.889 187791 DEBUG oslo_concurrency.lockutils [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.889 187791 DEBUG oslo_concurrency.lockutils [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.890 187791 DEBUG oslo_concurrency.lockutils [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.892 187791 INFO nova.compute.manager [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Terminating instance
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.894 187791 DEBUG nova.compute.manager [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 08 20:16:35 compute-0 kernel: tap820863a3-a6 (unregistering): left promiscuous mode
Dec 08 20:16:35 compute-0 NetworkManager[56229]: <info>  [1765224995.9183] device (tap820863a3-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 08 20:16:35 compute-0 ovn_controller[96170]: 2025-12-08T20:16:35Z|00091|binding|INFO|Releasing lport 820863a3-a6c4-4b66-962b-a1dd233d0420 from this chassis (sb_readonly=0)
Dec 08 20:16:35 compute-0 ovn_controller[96170]: 2025-12-08T20:16:35Z|00092|binding|INFO|Setting lport 820863a3-a6c4-4b66-962b-a1dd233d0420 down in Southbound
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.932 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:35 compute-0 ovn_controller[96170]: 2025-12-08T20:16:35Z|00093|binding|INFO|Removing iface tap820863a3-a6 ovn-installed in OVS
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.935 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:35.940 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:99:45 10.100.0.11'], port_security=['fa:16:3e:17:99:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'cb7797d1-3f0d-4927-8b0d-b6e999785b82', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f6265520-f130-4200-ac6a-92b78769cf12', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64781c8586b847fb8240984ca5fbf6fa', 'neutron:revision_number': '4', 'neutron:security_group_ids': '64456497-54bc-4a8a-b34b-88ab884bdf50', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b59c7489-ca7b-4a07-878a-369e272a2b1c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=820863a3-a6c4-4b66-962b-a1dd233d0420) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:16:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:35.941 105024 INFO neutron.agent.ovn.metadata.agent [-] Port 820863a3-a6c4-4b66-962b-a1dd233d0420 in datapath f6265520-f130-4200-ac6a-92b78769cf12 unbound from our chassis
Dec 08 20:16:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:35.942 105024 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f6265520-f130-4200-ac6a-92b78769cf12, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 08 20:16:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:35.944 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[2485300a-bb49-431a-93b6-623767cb95c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:35 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:35.945 105024 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12 namespace which is not needed anymore
Dec 08 20:16:35 compute-0 nova_compute[187787]: 2025-12-08 20:16:35.952 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:35 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000006.scope: Deactivated successfully.
Dec 08 20:16:35 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000006.scope: Consumed 3.109s CPU time.
Dec 08 20:16:35 compute-0 systemd-machined[154122]: Machine qemu-7-instance-00000006 terminated.
Dec 08 20:16:36 compute-0 neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12[216561]: [NOTICE]   (216567) : haproxy version is 2.8.14-c23fe91
Dec 08 20:16:36 compute-0 neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12[216561]: [NOTICE]   (216567) : path to executable is /usr/sbin/haproxy
Dec 08 20:16:36 compute-0 neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12[216561]: [WARNING]  (216567) : Exiting Master process...
Dec 08 20:16:36 compute-0 neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12[216561]: [WARNING]  (216567) : Exiting Master process...
Dec 08 20:16:36 compute-0 neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12[216561]: [ALERT]    (216567) : Current worker (216569) exited with code 143 (Terminated)
Dec 08 20:16:36 compute-0 neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12[216561]: [WARNING]  (216567) : All workers exited. Exiting... (0)
Dec 08 20:16:36 compute-0 systemd[1]: libpod-bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909.scope: Deactivated successfully.
Dec 08 20:16:36 compute-0 podman[216603]: 2025-12-08 20:16:36.089088548 +0000 UTC m=+0.042292213 container died bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.166 187791 INFO nova.virt.libvirt.driver [-] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Instance destroyed successfully.
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.167 187791 DEBUG nova.objects.instance [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lazy-loading 'resources' on Instance uuid cb7797d1-3f0d-4927-8b0d-b6e999785b82 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:16:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909-userdata-shm.mount: Deactivated successfully.
Dec 08 20:16:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-9e076e20911ae3c3cc93dbe2afaa9bb38d2e65e535f0c3704f4b94725b5fa9d3-merged.mount: Deactivated successfully.
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.190 187791 DEBUG nova.virt.libvirt.vif [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:16:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-547578929',display_name='tempest-ServerAddressesTestJSON-server-547578929',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-547578929',id=6,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:16:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='64781c8586b847fb8240984ca5fbf6fa',ramdisk_id='',reservation_id='r-6x7gej7s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-1485290880',owner_user_name='tempest-ServerAddressesTestJSON-1485290880-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:16:33Z,user_data=None,user_id='a4dcbe1bf6044eaf802a21c96f112cd7',uuid=cb7797d1-3f0d-4927-8b0d-b6e999785b82,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.190 187791 DEBUG nova.network.os_vif_util [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Converting VIF {"id": "820863a3-a6c4-4b66-962b-a1dd233d0420", "address": "fa:16:3e:17:99:45", "network": {"id": "f6265520-f130-4200-ac6a-92b78769cf12", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-980096173-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64781c8586b847fb8240984ca5fbf6fa", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap820863a3-a6", "ovs_interfaceid": "820863a3-a6c4-4b66-962b-a1dd233d0420", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.191 187791 DEBUG nova.network.os_vif_util [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:99:45,bridge_name='br-int',has_traffic_filtering=True,id=820863a3-a6c4-4b66-962b-a1dd233d0420,network=Network(f6265520-f130-4200-ac6a-92b78769cf12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap820863a3-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.191 187791 DEBUG os_vif [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:99:45,bridge_name='br-int',has_traffic_filtering=True,id=820863a3-a6c4-4b66-962b-a1dd233d0420,network=Network(f6265520-f130-4200-ac6a-92b78769cf12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap820863a3-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.193 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.194 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap820863a3-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.196 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.197 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.198 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.200 187791 INFO os_vif [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:99:45,bridge_name='br-int',has_traffic_filtering=True,id=820863a3-a6c4-4b66-962b-a1dd233d0420,network=Network(f6265520-f130-4200-ac6a-92b78769cf12),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap820863a3-a6')
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.201 187791 INFO nova.virt.libvirt.driver [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Deleting instance files /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82_del
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.202 187791 INFO nova.virt.libvirt.driver [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Deletion of /var/lib/nova/instances/cb7797d1-3f0d-4927-8b0d-b6e999785b82_del complete
Dec 08 20:16:36 compute-0 podman[216603]: 2025-12-08 20:16:36.232891016 +0000 UTC m=+0.186094671 container cleanup bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 08 20:16:36 compute-0 systemd[1]: libpod-conmon-bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909.scope: Deactivated successfully.
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.264 187791 INFO nova.compute.manager [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Took 0.37 seconds to destroy the instance on the hypervisor.
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.265 187791 DEBUG oslo.service.loopingcall [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.265 187791 DEBUG nova.compute.manager [-] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.265 187791 DEBUG nova.network.neutron [-] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 08 20:16:36 compute-0 podman[216647]: 2025-12-08 20:16:36.315444885 +0000 UTC m=+0.048854959 container remove bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Dec 08 20:16:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:36.322 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[131656b4-80cc-4cdf-bdd4-84caab2d2704]: (4, ('Mon Dec  8 08:16:36 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12 (bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909)\nbf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909\nMon Dec  8 08:16:36 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12 (bf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909)\nbf4e5daf6ae3603e6f9686c2d6c0176edf63e3d664842a2954df27488dae3909\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:36.324 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[44f8853d-3fa7-4eec-ad11-5e2a1915b31c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:36.325 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf6265520-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.439 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:36 compute-0 kernel: tapf6265520-f0: left promiscuous mode
Dec 08 20:16:36 compute-0 nova_compute[187787]: 2025-12-08 20:16:36.466 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:36.469 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[9030f385-4fa0-4701-9f7a-5bb9c65fccbc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:36.483 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0f1ae0-ff27-438d-9307-2c6c95a20d90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:36.484 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[31751312-220a-4d3f-a123-7079c92a2dea]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:36.503 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f7f503bb-f07b-4947-987b-0731d7394084]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352989, 'reachable_time': 42561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216662, 'error': None, 'target': 'ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:36.507 105136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f6265520-f130-4200-ac6a-92b78769cf12 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 08 20:16:36 compute-0 systemd[1]: run-netns-ovnmeta\x2df6265520\x2df130\x2d4200\x2dac6a\x2d92b78769cf12.mount: Deactivated successfully.
Dec 08 20:16:36 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:36.507 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[087616bd-b72d-41f1-91b4-8509f83f5a94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:36 compute-0 podman[216663]: 2025-12-08 20:16:36.616383711 +0000 UTC m=+0.073087882 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.784 187791 DEBUG nova.compute.manager [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Received event network-vif-unplugged-820863a3-a6c4-4b66-962b-a1dd233d0420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.784 187791 DEBUG oslo_concurrency.lockutils [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.784 187791 DEBUG oslo_concurrency.lockutils [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.785 187791 DEBUG oslo_concurrency.lockutils [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.785 187791 DEBUG nova.compute.manager [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] No waiting events found dispatching network-vif-unplugged-820863a3-a6c4-4b66-962b-a1dd233d0420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.785 187791 DEBUG nova.compute.manager [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Received event network-vif-unplugged-820863a3-a6c4-4b66-962b-a1dd233d0420 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.785 187791 DEBUG nova.compute.manager [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Received event network-vif-plugged-820863a3-a6c4-4b66-962b-a1dd233d0420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.786 187791 DEBUG oslo_concurrency.lockutils [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.787 187791 DEBUG oslo_concurrency.lockutils [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.787 187791 DEBUG oslo_concurrency.lockutils [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.788 187791 DEBUG nova.compute.manager [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] No waiting events found dispatching network-vif-plugged-820863a3-a6c4-4b66-962b-a1dd233d0420 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:16:37 compute-0 nova_compute[187787]: 2025-12-08 20:16:37.788 187791 WARNING nova.compute.manager [req-070ba19a-76ff-4f37-95bd-08daf80f08bf req-5cf1f40d-6952-4e74-80f0-3ddf00754817 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Received unexpected event network-vif-plugged-820863a3-a6c4-4b66-962b-a1dd233d0420 for instance with vm_state active and task_state deleting.
Dec 08 20:16:38 compute-0 nova_compute[187787]: 2025-12-08 20:16:38.263 187791 DEBUG nova.network.neutron [-] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:16:38 compute-0 nova_compute[187787]: 2025-12-08 20:16:38.286 187791 INFO nova.compute.manager [-] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Took 2.02 seconds to deallocate network for instance.
Dec 08 20:16:38 compute-0 nova_compute[187787]: 2025-12-08 20:16:38.337 187791 DEBUG oslo_concurrency.lockutils [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:38 compute-0 nova_compute[187787]: 2025-12-08 20:16:38.337 187791 DEBUG oslo_concurrency.lockutils [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:38 compute-0 nova_compute[187787]: 2025-12-08 20:16:38.429 187791 DEBUG nova.compute.provider_tree [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:16:38 compute-0 nova_compute[187787]: 2025-12-08 20:16:38.446 187791 DEBUG nova.scheduler.client.report [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:16:38 compute-0 sshd-session[216578]: Received disconnect from 101.47.160.247 port 49246:11: Bye Bye [preauth]
Dec 08 20:16:38 compute-0 sshd-session[216578]: Disconnected from authenticating user root 101.47.160.247 port 49246 [preauth]
Dec 08 20:16:38 compute-0 nova_compute[187787]: 2025-12-08 20:16:38.471 187791 DEBUG oslo_concurrency.lockutils [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:38 compute-0 nova_compute[187787]: 2025-12-08 20:16:38.502 187791 INFO nova.scheduler.client.report [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Deleted allocations for instance cb7797d1-3f0d-4927-8b0d-b6e999785b82
Dec 08 20:16:38 compute-0 nova_compute[187787]: 2025-12-08 20:16:38.570 187791 DEBUG oslo_concurrency.lockutils [None req-1033cff2-4e10-4f2c-ab41-eb8153d65d72 a4dcbe1bf6044eaf802a21c96f112cd7 64781c8586b847fb8240984ca5fbf6fa - - default default] Lock "cb7797d1-3f0d-4927-8b0d-b6e999785b82" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:39 compute-0 podman[216685]: 2025-12-08 20:16:39.49433769 +0000 UTC m=+0.067308081 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:16:39 compute-0 podman[216686]: 2025-12-08 20:16:39.49531547 +0000 UTC m=+0.065704290 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:16:40 compute-0 nova_compute[187787]: 2025-12-08 20:16:40.357 187791 DEBUG nova.compute.manager [req-3126805a-ca98-4203-9bfc-4fc33e71060b req-e0c80b67-917a-4ce8-8e5b-56c39e326583 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Received event network-vif-deleted-820863a3-a6c4-4b66-962b-a1dd233d0420 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:40 compute-0 nova_compute[187787]: 2025-12-08 20:16:40.741 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:41 compute-0 nova_compute[187787]: 2025-12-08 20:16:41.197 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:41 compute-0 nova_compute[187787]: 2025-12-08 20:16:41.294 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:42 compute-0 ovn_controller[96170]: 2025-12-08T20:16:42Z|00094|binding|INFO|Releasing lport 1c8adead-c286-4358-bc66-014ea62fec5c from this chassis (sb_readonly=0)
Dec 08 20:16:42 compute-0 nova_compute[187787]: 2025-12-08 20:16:42.076 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:45 compute-0 sshd-session[216725]: Invalid user dmdba from 47.76.127.165 port 58636
Dec 08 20:16:45 compute-0 sshd-session[216725]: Received disconnect from 47.76.127.165 port 58636:11: Bye Bye [preauth]
Dec 08 20:16:45 compute-0 sshd-session[216725]: Disconnected from invalid user dmdba 47.76.127.165 port 58636 [preauth]
Dec 08 20:16:45 compute-0 nova_compute[187787]: 2025-12-08 20:16:45.751 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:46 compute-0 nova_compute[187787]: 2025-12-08 20:16:46.199 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:47 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:47.227 105131 DEBUG eventlet.wsgi.server [-] (105131) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 08 20:16:47 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:47.229 105131 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0
Dec 08 20:16:47 compute-0 ovn_metadata_agent[105019]: Accept: */*
Dec 08 20:16:47 compute-0 ovn_metadata_agent[105019]: Connection: close
Dec 08 20:16:47 compute-0 ovn_metadata_agent[105019]: Content-Type: text/plain
Dec 08 20:16:47 compute-0 ovn_metadata_agent[105019]: Host: 169.254.169.254
Dec 08 20:16:47 compute-0 ovn_metadata_agent[105019]: User-Agent: curl/7.84.0
Dec 08 20:16:47 compute-0 ovn_metadata_agent[105019]: X-Forwarded-For: 10.100.0.10
Dec 08 20:16:47 compute-0 ovn_metadata_agent[105019]: X-Ovn-Network-Id: 00649599-f843-461b-8984-ef8b1c5591f5 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 08 20:16:47 compute-0 sshd-session[216728]: Invalid user user from 200.155.38.219 port 64035
Dec 08 20:16:47 compute-0 podman[216731]: 2025-12-08 20:16:47.460181302 +0000 UTC m=+0.074708733 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:16:47 compute-0 podman[216730]: 2025-12-08 20:16:47.493973346 +0000 UTC m=+0.109755977 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 08 20:16:47 compute-0 sshd-session[216728]: Received disconnect from 200.155.38.219 port 64035:11: Bye Bye [preauth]
Dec 08 20:16:47 compute-0 sshd-session[216728]: Disconnected from invalid user user 200.155.38.219 port 64035 [preauth]
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:49.367 105131 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:49.368 105131 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 2.1384721
Dec 08 20:16:49 compute-0 haproxy-metadata-proxy-00649599-f843-461b-8984-ef8b1c5591f5[216304]: 10.100.0.10:47414 [08/Dec/2025:20:16:47.225] listener listener/metadata 0/0/0/2142/2142 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:49.483 105131 DEBUG eventlet.wsgi.server [-] (105131) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:49.484 105131 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: Accept: */*
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: Connection: close
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: Content-Length: 100
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: Content-Type: application/x-www-form-urlencoded
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: Host: 169.254.169.254
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: User-Agent: curl/7.84.0
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: X-Forwarded-For: 10.100.0.10
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: X-Ovn-Network-Id: 00649599-f843-461b-8984-ef8b1c5591f5
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:49.809 105131 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Dec 08 20:16:49 compute-0 haproxy-metadata-proxy-00649599-f843-461b-8984-ef8b1c5591f5[216304]: 10.100.0.10:47416 [08/Dec/2025:20:16:49.483] listener listener/metadata 0/0/0/327/327 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Dec 08 20:16:49 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:49.810 105131 INFO eventlet.wsgi.server [-] 10.100.0.10,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.3257141
Dec 08 20:16:50 compute-0 nova_compute[187787]: 2025-12-08 20:16:50.782 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.166 187791 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765224996.1644173, cb7797d1-3f0d-4927-8b0d-b6e999785b82 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.167 187791 INFO nova.compute.manager [-] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] VM Stopped (Lifecycle Event)
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.191 187791 DEBUG nova.compute.manager [None req-ea896ffb-b7c0-4359-ad0b-3a1cde6682ea - - - - - -] [instance: cb7797d1-3f0d-4927-8b0d-b6e999785b82] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.201 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:51 compute-0 podman[216774]: 2025-12-08 20:16:51.483216935 +0000 UTC m=+0.056402527 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.683 187791 DEBUG oslo_concurrency.lockutils [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.683 187791 DEBUG oslo_concurrency.lockutils [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.684 187791 DEBUG oslo_concurrency.lockutils [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.684 187791 DEBUG oslo_concurrency.lockutils [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.684 187791 DEBUG oslo_concurrency.lockutils [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.685 187791 INFO nova.compute.manager [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Terminating instance
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.686 187791 DEBUG nova.compute.manager [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 08 20:16:51 compute-0 kernel: tapfe4ee820-4e (unregistering): left promiscuous mode
Dec 08 20:16:51 compute-0 NetworkManager[56229]: <info>  [1765225011.7160] device (tapfe4ee820-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 08 20:16:51 compute-0 ovn_controller[96170]: 2025-12-08T20:16:51Z|00095|binding|INFO|Releasing lport fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad from this chassis (sb_readonly=0)
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.726 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:51 compute-0 ovn_controller[96170]: 2025-12-08T20:16:51Z|00096|binding|INFO|Setting lport fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad down in Southbound
Dec 08 20:16:51 compute-0 ovn_controller[96170]: 2025-12-08T20:16:51Z|00097|binding|INFO|Removing iface tapfe4ee820-4e ovn-installed in OVS
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.729 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:51 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:51.734 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:a1:23 10.100.0.10'], port_security=['fa:16:3e:0e:a1:23 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00649599-f843-461b-8984-ef8b1c5591f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '67a776abf9054fd3b9fd5701a5c2a131', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a3601c45-bd2b-45f5-b04a-8b3857fccd88 f8a9470e-b0c0-4d28-a3b4-b8e2afe7ce40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7820fe20-07cc-462b-bc6d-c654b1086d88, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:16:51 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:51.735 105024 INFO neutron.agent.ovn.metadata.agent [-] Port fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad in datapath 00649599-f843-461b-8984-ef8b1c5591f5 unbound from our chassis
Dec 08 20:16:51 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:51.737 105024 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00649599-f843-461b-8984-ef8b1c5591f5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 08 20:16:51 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:51.739 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[075a4f39-b6b4-49a6-952e-f8f252465ab9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:51 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:51.740 105024 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5 namespace which is not needed anymore
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.748 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:51 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000005.scope: Deactivated successfully.
Dec 08 20:16:51 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000005.scope: Consumed 14.669s CPU time.
Dec 08 20:16:51 compute-0 systemd-machined[154122]: Machine qemu-6-instance-00000005 terminated.
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.978 187791 INFO nova.virt.libvirt.driver [-] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Instance destroyed successfully.
Dec 08 20:16:51 compute-0 nova_compute[187787]: 2025-12-08 20:16:51.979 187791 DEBUG nova.objects.instance [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lazy-loading 'resources' on Instance uuid 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.004 187791 DEBUG nova.virt.libvirt.vif [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-124047142',display_name='tempest-TestServerBasicOps-server-124047142',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testserverbasicops-server-124047142',id=5,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNQSZtyugIOV6BIaMv5PCdQ3D3cXktHKxGmMONWzOHwlzVu6FkI1m7OO549cPk49Q2U1ZPrlDAE5eikWrXfTWD3qAFA2v5gaa3i0yzRmMrrvetYGsXNRGfH3TBzcoY+1/Q==',key_name='tempest-TestServerBasicOps-1490320690',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:16:21Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='67a776abf9054fd3b9fd5701a5c2a131',ramdisk_id='',reservation_id='r-a72qcizu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-630825278',owner_user_name='tempest-TestServerBasicOps-630825278-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:16:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9b86c7d5501a42d0bc8d49585ff3a697',uuid=4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.005 187791 DEBUG nova.network.os_vif_util [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Converting VIF {"id": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "address": "fa:16:3e:0e:a1:23", "network": {"id": "00649599-f843-461b-8984-ef8b1c5591f5", "bridge": "br-int", "label": "tempest-TestServerBasicOps-542923508-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "67a776abf9054fd3b9fd5701a5c2a131", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe4ee820-4e", "ovs_interfaceid": "fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.006 187791 DEBUG nova.network.os_vif_util [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0e:a1:23,bridge_name='br-int',has_traffic_filtering=True,id=fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad,network=Network(00649599-f843-461b-8984-ef8b1c5591f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4ee820-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.006 187791 DEBUG os_vif [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:a1:23,bridge_name='br-int',has_traffic_filtering=True,id=fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad,network=Network(00649599-f843-461b-8984-ef8b1c5591f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4ee820-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.008 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.009 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe4ee820-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.017 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.019 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.022 187791 INFO os_vif [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:a1:23,bridge_name='br-int',has_traffic_filtering=True,id=fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad,network=Network(00649599-f843-461b-8984-ef8b1c5591f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe4ee820-4e')
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.023 187791 INFO nova.virt.libvirt.driver [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Deleting instance files /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6_del
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.024 187791 INFO nova.virt.libvirt.driver [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Deletion of /var/lib/nova/instances/4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6_del complete
Dec 08 20:16:52 compute-0 neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5[216298]: [NOTICE]   (216302) : haproxy version is 2.8.14-c23fe91
Dec 08 20:16:52 compute-0 neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5[216298]: [NOTICE]   (216302) : path to executable is /usr/sbin/haproxy
Dec 08 20:16:52 compute-0 neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5[216298]: [WARNING]  (216302) : Exiting Master process...
Dec 08 20:16:52 compute-0 neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5[216298]: [ALERT]    (216302) : Current worker (216304) exited with code 143 (Terminated)
Dec 08 20:16:52 compute-0 neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5[216298]: [WARNING]  (216302) : All workers exited. Exiting... (0)
Dec 08 20:16:52 compute-0 systemd[1]: libpod-8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2.scope: Deactivated successfully.
Dec 08 20:16:52 compute-0 podman[216822]: 2025-12-08 20:16:52.051154788 +0000 UTC m=+0.192321087 container died 8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.099 187791 INFO nova.compute.manager [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Took 0.41 seconds to destroy the instance on the hypervisor.
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.099 187791 DEBUG oslo.service.loopingcall [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.100 187791 DEBUG nova.compute.manager [-] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.100 187791 DEBUG nova.network.neutron [-] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 08 20:16:52 compute-0 ovn_controller[96170]: 2025-12-08T20:16:52Z|00098|binding|INFO|Releasing lport 1c8adead-c286-4358-bc66-014ea62fec5c from this chassis (sb_readonly=0)
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.404 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:52 compute-0 ovn_controller[96170]: 2025-12-08T20:16:52Z|00099|binding|INFO|Releasing lport 1c8adead-c286-4358-bc66-014ea62fec5c from this chassis (sb_readonly=0)
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.589 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:52 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2-userdata-shm.mount: Deactivated successfully.
Dec 08 20:16:52 compute-0 systemd[1]: var-lib-containers-storage-overlay-b7bb8229d5f2629b09ed2a4be4b50afda9e1efcffe0761f51f2c2ced345198b0-merged.mount: Deactivated successfully.
Dec 08 20:16:52 compute-0 podman[216822]: 2025-12-08 20:16:52.722324282 +0000 UTC m=+0.863490591 container cleanup 8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 08 20:16:52 compute-0 systemd[1]: libpod-conmon-8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2.scope: Deactivated successfully.
Dec 08 20:16:52 compute-0 podman[216869]: 2025-12-08 20:16:52.839059877 +0000 UTC m=+0.091012537 container remove 8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 08 20:16:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:52.847 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[1cce1bda-d0bf-4fcb-a921-e35c7792172f]: (4, ('Mon Dec  8 08:16:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5 (8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2)\n8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2\nMon Dec  8 08:16:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5 (8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2)\n8f448a4396ef0068603f98e9cb8a0d64152d42a8ce4fce7efd42647d067e14c2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:52.849 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[3f170f47-d30b-45fe-9785-596062294bc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:52.850 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00649599-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.853 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:52 compute-0 kernel: tap00649599-f0: left promiscuous mode
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.867 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.868 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:52.869 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7e442b-8c46-4614-857a-dadd31a4441b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.875 187791 DEBUG nova.compute.manager [req-80c5118f-6ad4-4139-b7f9-f151fdfa785b req-186c1ea9-eac0-4e5d-a7f7-09a74876e0f8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received event network-vif-unplugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.875 187791 DEBUG oslo_concurrency.lockutils [req-80c5118f-6ad4-4139-b7f9-f151fdfa785b req-186c1ea9-eac0-4e5d-a7f7-09a74876e0f8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.875 187791 DEBUG oslo_concurrency.lockutils [req-80c5118f-6ad4-4139-b7f9-f151fdfa785b req-186c1ea9-eac0-4e5d-a7f7-09a74876e0f8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.876 187791 DEBUG oslo_concurrency.lockutils [req-80c5118f-6ad4-4139-b7f9-f151fdfa785b req-186c1ea9-eac0-4e5d-a7f7-09a74876e0f8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.876 187791 DEBUG nova.compute.manager [req-80c5118f-6ad4-4139-b7f9-f151fdfa785b req-186c1ea9-eac0-4e5d-a7f7-09a74876e0f8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] No waiting events found dispatching network-vif-unplugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:16:52 compute-0 nova_compute[187787]: 2025-12-08 20:16:52.876 187791 DEBUG nova.compute.manager [req-80c5118f-6ad4-4139-b7f9-f151fdfa785b req-186c1ea9-eac0-4e5d-a7f7-09a74876e0f8 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received event network-vif-unplugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 08 20:16:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:52.887 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[ae539e95-1020-4591-ad6f-ea43e9f025ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:52.888 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[dac637c7-e0ed-4bc8-b72b-5b51fc227639]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:52.909 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[9b7af542-2054-4824-9688-2b18f481482f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 351281, 'reachable_time': 38024, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216883, 'error': None, 'target': 'ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:52.912 105136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00649599-f843-461b-8984-ef8b1c5591f5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 08 20:16:52 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:52.912 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[7163a7e0-4d0a-4054-90dd-89569bb975ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:16:52 compute-0 systemd[1]: run-netns-ovnmeta\x2d00649599\x2df843\x2d461b\x2d8984\x2def8b1c5591f5.mount: Deactivated successfully.
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.008 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:53.008 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ea:67:f9', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1e:d7:e5:ba:bd:f4'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:16:53 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:53.011 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.679 187791 DEBUG nova.network.neutron [-] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.712 187791 INFO nova.compute.manager [-] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Took 1.61 seconds to deallocate network for instance.
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.769 187791 DEBUG oslo_concurrency.lockutils [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.770 187791 DEBUG oslo_concurrency.lockutils [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.795 187791 DEBUG nova.compute.manager [req-965f7084-9ebe-487f-b032-4db5480595be req-63d50f47-46c0-4289-92fc-880073cd02da 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received event network-vif-deleted-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.838 187791 DEBUG nova.compute.provider_tree [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.854 187791 DEBUG nova.scheduler.client.report [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.879 187791 DEBUG oslo_concurrency.lockutils [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:53 compute-0 nova_compute[187787]: 2025-12-08 20:16:53.912 187791 INFO nova.scheduler.client.report [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Deleted allocations for instance 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6
Dec 08 20:16:54 compute-0 nova_compute[187787]: 2025-12-08 20:16:54.000 187791 DEBUG oslo_concurrency.lockutils [None req-2ebe87c2-8c18-40eb-8f83-61a7e26fc9f3 9b86c7d5501a42d0bc8d49585ff3a697 67a776abf9054fd3b9fd5701a5c2a131 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.317s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:54.989 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:54.990 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:54.990 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:55 compute-0 nova_compute[187787]: 2025-12-08 20:16:55.085 187791 DEBUG nova.compute.manager [req-54a58388-49d5-49d5-ab80-d811ae8204c9 req-c1a81a6a-32e5-46f6-aae3-ef7ba2b77e4d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received event network-vif-plugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:16:55 compute-0 nova_compute[187787]: 2025-12-08 20:16:55.085 187791 DEBUG oslo_concurrency.lockutils [req-54a58388-49d5-49d5-ab80-d811ae8204c9 req-c1a81a6a-32e5-46f6-aae3-ef7ba2b77e4d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:16:55 compute-0 nova_compute[187787]: 2025-12-08 20:16:55.086 187791 DEBUG oslo_concurrency.lockutils [req-54a58388-49d5-49d5-ab80-d811ae8204c9 req-c1a81a6a-32e5-46f6-aae3-ef7ba2b77e4d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:16:55 compute-0 nova_compute[187787]: 2025-12-08 20:16:55.086 187791 DEBUG oslo_concurrency.lockutils [req-54a58388-49d5-49d5-ab80-d811ae8204c9 req-c1a81a6a-32e5-46f6-aae3-ef7ba2b77e4d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:16:55 compute-0 nova_compute[187787]: 2025-12-08 20:16:55.086 187791 DEBUG nova.compute.manager [req-54a58388-49d5-49d5-ab80-d811ae8204c9 req-c1a81a6a-32e5-46f6-aae3-ef7ba2b77e4d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] No waiting events found dispatching network-vif-plugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:16:55 compute-0 nova_compute[187787]: 2025-12-08 20:16:55.087 187791 WARNING nova.compute.manager [req-54a58388-49d5-49d5-ab80-d811ae8204c9 req-c1a81a6a-32e5-46f6-aae3-ef7ba2b77e4d 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Received unexpected event network-vif-plugged-fe4ee820-4ef4-491d-8ab8-b0cabfe2a5ad for instance with vm_state deleted and task_state None.
Dec 08 20:16:55 compute-0 nova_compute[187787]: 2025-12-08 20:16:55.785 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:57 compute-0 nova_compute[187787]: 2025-12-08 20:16:57.019 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:16:59 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:16:59.014 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:16:59 compute-0 podman[202017]: time="2025-12-08T20:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:16:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:16:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3467 "" "Go-http-client/1.1"
Dec 08 20:17:00 compute-0 nova_compute[187787]: 2025-12-08 20:17:00.794 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:01 compute-0 openstack_network_exporter[204149]: ERROR   20:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:17:01 compute-0 openstack_network_exporter[204149]: ERROR   20:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:17:01 compute-0 openstack_network_exporter[204149]: ERROR   20:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:17:01 compute-0 openstack_network_exporter[204149]: ERROR   20:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:17:01 compute-0 openstack_network_exporter[204149]: ERROR   20:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:17:01 compute-0 podman[216885]: 2025-12-08 20:17:01.532324998 +0000 UTC m=+0.070873606 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.4, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Dec 08 20:17:02 compute-0 nova_compute[187787]: 2025-12-08 20:17:02.021 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:05 compute-0 nova_compute[187787]: 2025-12-08 20:17:05.790 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:06 compute-0 nova_compute[187787]: 2025-12-08 20:17:06.974 187791 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765225011.9726715, 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:17:06 compute-0 nova_compute[187787]: 2025-12-08 20:17:06.975 187791 INFO nova.compute.manager [-] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] VM Stopped (Lifecycle Event)
Dec 08 20:17:07 compute-0 nova_compute[187787]: 2025-12-08 20:17:07.008 187791 DEBUG nova.compute.manager [None req-f28b1abd-95f1-4490-828f-b3eefb83aada - - - - - -] [instance: 4397f1e8-3fd6-4dfd-82f9-495ffcd67ec6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:17:07 compute-0 nova_compute[187787]: 2025-12-08 20:17:07.057 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:07 compute-0 podman[216906]: 2025-12-08 20:17:07.518750666 +0000 UTC m=+0.080029671 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 08 20:17:10 compute-0 podman[216929]: 2025-12-08 20:17:10.486457027 +0000 UTC m=+0.051943937 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 08 20:17:10 compute-0 podman[216928]: 2025-12-08 20:17:10.489098379 +0000 UTC m=+0.060980698 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:17:10 compute-0 nova_compute[187787]: 2025-12-08 20:17:10.793 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:12 compute-0 nova_compute[187787]: 2025-12-08 20:17:12.059 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:15 compute-0 nova_compute[187787]: 2025-12-08 20:17:15.795 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:17 compute-0 nova_compute[187787]: 2025-12-08 20:17:17.063 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:18 compute-0 podman[216970]: 2025-12-08 20:17:18.517753594 +0000 UTC m=+0.079224976 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Dec 08 20:17:18 compute-0 podman[216969]: 2025-12-08 20:17:18.520671585 +0000 UTC m=+0.088677961 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 08 20:17:18 compute-0 nova_compute[187787]: 2025-12-08 20:17:18.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.835 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.836 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.836 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.836 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.839 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.840 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.840 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.841 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac119a0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.851 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.852 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:17:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:17:20 compute-0 nova_compute[187787]: 2025-12-08 20:17:20.797 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:21 compute-0 nova_compute[187787]: 2025-12-08 20:17:21.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:17:21 compute-0 nova_compute[187787]: 2025-12-08 20:17:21.779 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:17:21 compute-0 nova_compute[187787]: 2025-12-08 20:17:21.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:17:21 compute-0 nova_compute[187787]: 2025-12-08 20:17:21.818 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:17:21 compute-0 nova_compute[187787]: 2025-12-08 20:17:21.818 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:17:21 compute-0 nova_compute[187787]: 2025-12-08 20:17:21.818 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:17:21 compute-0 nova_compute[187787]: 2025-12-08 20:17:21.819 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:17:21 compute-0 podman[217017]: 2025-12-08 20:17:21.956409098 +0000 UTC m=+0.094041717 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.017 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.018 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5673MB free_disk=72.88076400756836GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.018 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.018 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.069 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.099 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.100 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.127 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.149 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.189 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:17:22 compute-0 nova_compute[187787]: 2025-12-08 20:17:22.190 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:17:23 compute-0 nova_compute[187787]: 2025-12-08 20:17:23.191 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:17:23 compute-0 nova_compute[187787]: 2025-12-08 20:17:23.192 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:17:23 compute-0 nova_compute[187787]: 2025-12-08 20:17:23.192 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:17:23 compute-0 nova_compute[187787]: 2025-12-08 20:17:23.209 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:17:23 compute-0 nova_compute[187787]: 2025-12-08 20:17:23.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:17:23 compute-0 nova_compute[187787]: 2025-12-08 20:17:23.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:17:23 compute-0 nova_compute[187787]: 2025-12-08 20:17:23.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:17:24 compute-0 nova_compute[187787]: 2025-12-08 20:17:24.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:17:24 compute-0 nova_compute[187787]: 2025-12-08 20:17:24.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:17:25 compute-0 nova_compute[187787]: 2025-12-08 20:17:25.860 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:27 compute-0 nova_compute[187787]: 2025-12-08 20:17:27.072 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:29 compute-0 podman[202017]: time="2025-12-08T20:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:17:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:17:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3465 "" "Go-http-client/1.1"
Dec 08 20:17:30 compute-0 nova_compute[187787]: 2025-12-08 20:17:30.863 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:31 compute-0 openstack_network_exporter[204149]: ERROR   20:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:17:31 compute-0 openstack_network_exporter[204149]: ERROR   20:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:17:31 compute-0 openstack_network_exporter[204149]: ERROR   20:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:17:31 compute-0 openstack_network_exporter[204149]: ERROR   20:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:17:31 compute-0 openstack_network_exporter[204149]: ERROR   20:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:17:32 compute-0 nova_compute[187787]: 2025-12-08 20:17:32.074 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:32 compute-0 podman[217041]: 2025-12-08 20:17:32.508470099 +0000 UTC m=+0.071259368 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4)
Dec 08 20:17:35 compute-0 nova_compute[187787]: 2025-12-08 20:17:35.867 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:37 compute-0 nova_compute[187787]: 2025-12-08 20:17:37.080 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:38 compute-0 podman[217061]: 2025-12-08 20:17:38.532071674 +0000 UTC m=+0.081793306 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 08 20:17:40 compute-0 nova_compute[187787]: 2025-12-08 20:17:40.900 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:41 compute-0 ovn_controller[96170]: 2025-12-08T20:17:41Z|00100|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Dec 08 20:17:41 compute-0 podman[217082]: 2025-12-08 20:17:41.521220813 +0000 UTC m=+0.060653779 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:17:41 compute-0 podman[217083]: 2025-12-08 20:17:41.526433035 +0000 UTC m=+0.060379999 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true)
Dec 08 20:17:42 compute-0 nova_compute[187787]: 2025-12-08 20:17:42.084 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:44 compute-0 sshd-session[217123]: Invalid user oo from 103.172.28.62 port 56352
Dec 08 20:17:44 compute-0 sshd-session[217123]: Received disconnect from 103.172.28.62 port 56352:11: Bye Bye [preauth]
Dec 08 20:17:44 compute-0 sshd-session[217123]: Disconnected from invalid user oo 103.172.28.62 port 56352 [preauth]
Dec 08 20:17:45 compute-0 nova_compute[187787]: 2025-12-08 20:17:45.934 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:47 compute-0 nova_compute[187787]: 2025-12-08 20:17:47.127 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:49 compute-0 podman[217127]: 2025-12-08 20:17:49.52653797 +0000 UTC m=+0.084436407 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2)
Dec 08 20:17:49 compute-0 podman[217126]: 2025-12-08 20:17:49.55866699 +0000 UTC m=+0.117128085 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 08 20:17:50 compute-0 nova_compute[187787]: 2025-12-08 20:17:50.936 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:52 compute-0 nova_compute[187787]: 2025-12-08 20:17:52.131 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:52 compute-0 podman[217174]: 2025-12-08 20:17:52.516665489 +0000 UTC m=+0.079009629 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:17:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:17:54.991 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:17:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:17:54.991 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:17:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:17:54.991 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:17:55 compute-0 nova_compute[187787]: 2025-12-08 20:17:55.938 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:57 compute-0 nova_compute[187787]: 2025-12-08 20:17:57.134 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:57 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:17:57.279 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ea:67:f9', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1e:d7:e5:ba:bd:f4'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:17:57 compute-0 nova_compute[187787]: 2025-12-08 20:17:57.280 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:17:57 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:17:57.283 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 08 20:17:59 compute-0 podman[202017]: time="2025-12-08T20:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:17:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:17:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3456 "" "Go-http-client/1.1"
Dec 08 20:18:00 compute-0 nova_compute[187787]: 2025-12-08 20:18:00.993 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:01 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:01.286 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:01 compute-0 openstack_network_exporter[204149]: ERROR   20:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:18:01 compute-0 openstack_network_exporter[204149]: ERROR   20:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:18:01 compute-0 openstack_network_exporter[204149]: ERROR   20:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:18:01 compute-0 openstack_network_exporter[204149]: ERROR   20:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:18:01 compute-0 openstack_network_exporter[204149]: ERROR   20:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:18:02 compute-0 nova_compute[187787]: 2025-12-08 20:18:02.186 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:03 compute-0 podman[217199]: 2025-12-08 20:18:03.50324965 +0000 UTC m=+0.064037063 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:18:06 compute-0 nova_compute[187787]: 2025-12-08 20:18:06.041 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:07 compute-0 nova_compute[187787]: 2025-12-08 20:18:07.223 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:09 compute-0 podman[217221]: 2025-12-08 20:18:09.515230014 +0000 UTC m=+0.081715983 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9)
Dec 08 20:18:11 compute-0 nova_compute[187787]: 2025-12-08 20:18:11.074 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:12 compute-0 nova_compute[187787]: 2025-12-08 20:18:12.264 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:12 compute-0 podman[217245]: 2025-12-08 20:18:12.525733106 +0000 UTC m=+0.079293799 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:18:12 compute-0 podman[217244]: 2025-12-08 20:18:12.564167541 +0000 UTC m=+0.118879999 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:18:13 compute-0 sshd-session[217242]: Received disconnect from 45.174.162.68 port 6412:11: Bye Bye [preauth]
Dec 08 20:18:13 compute-0 sshd-session[217242]: Disconnected from authenticating user root 45.174.162.68 port 6412 [preauth]
Dec 08 20:18:15 compute-0 sshd-session[217288]: Received disconnect from 47.76.127.165 port 48768:11: Bye Bye [preauth]
Dec 08 20:18:15 compute-0 sshd-session[217288]: Disconnected from authenticating user root 47.76.127.165 port 48768 [preauth]
Dec 08 20:18:16 compute-0 nova_compute[187787]: 2025-12-08 20:18:16.076 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:17 compute-0 nova_compute[187787]: 2025-12-08 20:18:17.313 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:17 compute-0 nova_compute[187787]: 2025-12-08 20:18:17.724 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:17 compute-0 nova_compute[187787]: 2025-12-08 20:18:17.725 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:17 compute-0 nova_compute[187787]: 2025-12-08 20:18:17.754 187791 DEBUG nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 08 20:18:17 compute-0 nova_compute[187787]: 2025-12-08 20:18:17.870 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:17 compute-0 nova_compute[187787]: 2025-12-08 20:18:17.871 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:17 compute-0 nova_compute[187787]: 2025-12-08 20:18:17.885 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 08 20:18:17 compute-0 nova_compute[187787]: 2025-12-08 20:18:17.887 187791 INFO nova.compute.claims [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Claim successful on node compute-0.ctlplane.example.com
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.007 187791 DEBUG nova.compute.provider_tree [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.023 187791 DEBUG nova.scheduler.client.report [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.053 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.055 187791 DEBUG nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.117 187791 DEBUG nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.117 187791 DEBUG nova.network.neutron [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.140 187791 INFO nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.157 187791 DEBUG nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.282 187791 DEBUG nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.284 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.284 187791 INFO nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Creating image(s)
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.285 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "/var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.286 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "/var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.287 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "/var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.311 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.383 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.384 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.385 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.397 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.419 187791 DEBUG nova.policy [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '87606be528da4d588b06cc2635781b15', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1eb0a13dd3f749b583ab1cf652d42ead', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.455 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.456 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.502 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.504 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.504 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.569 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.570 187791 DEBUG nova.virt.disk.api [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Checking if we can resize image /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.571 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.630 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.631 187791 DEBUG nova.virt.disk.api [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Cannot resize image /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.632 187791 DEBUG nova.objects.instance [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lazy-loading 'migration_context' on Instance uuid 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.663 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.664 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Ensure instance console log exists: /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.664 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.665 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:18 compute-0 nova_compute[187787]: 2025-12-08 20:18:18.665 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:19 compute-0 nova_compute[187787]: 2025-12-08 20:18:19.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:20 compute-0 nova_compute[187787]: 2025-12-08 20:18:20.216 187791 DEBUG nova.network.neutron [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Successfully created port: ea23d441-1529-4558-b8e2-b0240af97aef _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 08 20:18:20 compute-0 podman[217307]: 2025-12-08 20:18:20.511574408 +0000 UTC m=+0.079124403 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 08 20:18:20 compute-0 podman[217306]: 2025-12-08 20:18:20.543070037 +0000 UTC m=+0.114278936 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller)
Dec 08 20:18:20 compute-0 nova_compute[187787]: 2025-12-08 20:18:20.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:20 compute-0 nova_compute[187787]: 2025-12-08 20:18:20.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 08 20:18:20 compute-0 nova_compute[187787]: 2025-12-08 20:18:20.805 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.083 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.165 187791 DEBUG nova.network.neutron [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Successfully updated port: ea23d441-1529-4558-b8e2-b0240af97aef _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.189 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.190 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquired lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.190 187791 DEBUG nova.network.neutron [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.268 187791 DEBUG nova.compute.manager [req-01ae1fd0-ff6a-4d28-bb1b-5daf384e8ffb req-76efea93-e2bd-4274-bdbd-e2b77118278a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received event network-changed-ea23d441-1529-4558-b8e2-b0240af97aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.268 187791 DEBUG nova.compute.manager [req-01ae1fd0-ff6a-4d28-bb1b-5daf384e8ffb req-76efea93-e2bd-4274-bdbd-e2b77118278a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Refreshing instance network info cache due to event network-changed-ea23d441-1529-4558-b8e2-b0240af97aef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.269 187791 DEBUG oslo_concurrency.lockutils [req-01ae1fd0-ff6a-4d28-bb1b-5daf384e8ffb req-76efea93-e2bd-4274-bdbd-e2b77118278a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.394 187791 DEBUG nova.network.neutron [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.804 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:21 compute-0 nova_compute[187787]: 2025-12-08 20:18:21.805 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.302 187791 DEBUG nova.network.neutron [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Updating instance_info_cache with network_info: [{"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.350 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.359 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Releasing lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.360 187791 DEBUG nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Instance network_info: |[{"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.360 187791 DEBUG oslo_concurrency.lockutils [req-01ae1fd0-ff6a-4d28-bb1b-5daf384e8ffb req-76efea93-e2bd-4274-bdbd-e2b77118278a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.360 187791 DEBUG nova.network.neutron [req-01ae1fd0-ff6a-4d28-bb1b-5daf384e8ffb req-76efea93-e2bd-4274-bdbd-e2b77118278a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Refreshing network info cache for port ea23d441-1529-4558-b8e2-b0240af97aef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.366 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Start _get_guest_xml network_info=[{"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.373 187791 WARNING nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.393 187791 DEBUG nova.virt.libvirt.host [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.395 187791 DEBUG nova.virt.libvirt.host [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.402 187791 DEBUG nova.virt.libvirt.host [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.403 187791 DEBUG nova.virt.libvirt.host [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.403 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.404 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-08T20:13:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2f15909f-e95c-4c15-b311-ac90858a554d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.405 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.405 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.406 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.406 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.407 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.407 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.407 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.408 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.408 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.409 187791 DEBUG nova.virt.hardware [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.417 187791 DEBUG nova.virt.libvirt.vif [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:18:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-260707184',display_name='tempest-TestNetworkBasicOps-server-260707184',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-260707184',id=7,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLd2cSTEUvFxgsV2ieVQZVMjc1Ieym1ixwtjsuBGMy7n7mrLOVqiBR013Z9hdxuXV9gHdwd006mZ3wW76WoWGANP40iQpeRarJ9oX8affq75YCUFwMGk1QpFwTF1rnEDIw==',key_name='tempest-TestNetworkBasicOps-2036254642',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1eb0a13dd3f749b583ab1cf652d42ead',ramdisk_id='',reservation_id='r-le16w1u1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-827289320',owner_user_name='tempest-TestNetworkBasicOps-827289320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:18:18Z,user_data=None,user_id='87606be528da4d588b06cc2635781b15',uuid=65bbe4d2-0789-4405-9b37-2d5bd7b5f5af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.418 187791 DEBUG nova.network.os_vif_util [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converting VIF {"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.419 187791 DEBUG nova.network.os_vif_util [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:9e:29,bridge_name='br-int',has_traffic_filtering=True,id=ea23d441-1529-4558-b8e2-b0240af97aef,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea23d441-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.421 187791 DEBUG nova.objects.instance [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lazy-loading 'pci_devices' on Instance uuid 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.448 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] End _get_guest_xml xml=<domain type="kvm">
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <uuid>65bbe4d2-0789-4405-9b37-2d5bd7b5f5af</uuid>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <name>instance-00000007</name>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <memory>131072</memory>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <vcpu>1</vcpu>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <metadata>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <nova:name>tempest-TestNetworkBasicOps-server-260707184</nova:name>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <nova:creationTime>2025-12-08 20:18:22</nova:creationTime>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <nova:flavor name="m1.nano">
Dec 08 20:18:22 compute-0 nova_compute[187787]:         <nova:memory>128</nova:memory>
Dec 08 20:18:22 compute-0 nova_compute[187787]:         <nova:disk>1</nova:disk>
Dec 08 20:18:22 compute-0 nova_compute[187787]:         <nova:swap>0</nova:swap>
Dec 08 20:18:22 compute-0 nova_compute[187787]:         <nova:ephemeral>0</nova:ephemeral>
Dec 08 20:18:22 compute-0 nova_compute[187787]:         <nova:vcpus>1</nova:vcpus>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       </nova:flavor>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <nova:owner>
Dec 08 20:18:22 compute-0 nova_compute[187787]:         <nova:user uuid="87606be528da4d588b06cc2635781b15">tempest-TestNetworkBasicOps-827289320-project-member</nova:user>
Dec 08 20:18:22 compute-0 nova_compute[187787]:         <nova:project uuid="1eb0a13dd3f749b583ab1cf652d42ead">tempest-TestNetworkBasicOps-827289320</nova:project>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       </nova:owner>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <nova:root type="image" uuid="ffae60d8-1843-4b3a-9d11-b077095cedb9"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <nova:ports>
Dec 08 20:18:22 compute-0 nova_compute[187787]:         <nova:port uuid="ea23d441-1529-4558-b8e2-b0240af97aef">
Dec 08 20:18:22 compute-0 nova_compute[187787]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:         </nova:port>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       </nova:ports>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     </nova:instance>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   </metadata>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <sysinfo type="smbios">
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <system>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <entry name="manufacturer">RDO</entry>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <entry name="product">OpenStack Compute</entry>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <entry name="serial">65bbe4d2-0789-4405-9b37-2d5bd7b5f5af</entry>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <entry name="uuid">65bbe4d2-0789-4405-9b37-2d5bd7b5f5af</entry>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <entry name="family">Virtual Machine</entry>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     </system>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   </sysinfo>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <os>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <boot dev="hd"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <smbios mode="sysinfo"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   </os>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <features>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <acpi/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <apic/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <vmcoreinfo/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   </features>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <clock offset="utc">
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <timer name="pit" tickpolicy="delay"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <timer name="hpet" present="no"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   </clock>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <cpu mode="host-model" match="exact">
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <topology sockets="1" cores="1" threads="1"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <disk type="file" device="disk">
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <target dev="vda" bus="virtio"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <disk type="file" device="cdrom">
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <driver name="qemu" type="raw" cache="none"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk.config"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <target dev="sda" bus="sata"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <interface type="ethernet">
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <mac address="fa:16:3e:e8:9e:29"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <driver name="vhost" rx_queue_size="512"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <mtu size="1442"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <target dev="tapea23d441-15"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <serial type="pty">
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <log file="/var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/console.log" append="off"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     </serial>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <video>
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     </video>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <input type="tablet" bus="usb"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <rng model="virtio">
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <backend model="random">/dev/urandom</backend>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <controller type="usb" index="0"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     <memballoon model="virtio">
Dec 08 20:18:22 compute-0 nova_compute[187787]:       <stats period="10"/>
Dec 08 20:18:22 compute-0 nova_compute[187787]:     </memballoon>
Dec 08 20:18:22 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:18:22 compute-0 nova_compute[187787]: </domain>
Dec 08 20:18:22 compute-0 nova_compute[187787]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
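[editor's note] The XML dumped above by _get_guest_xml is the domain definition nova hands to libvirt. A minimal sketch, assuming the python3-libvirt bindings and permission to open qemu:///system, of reading the same XML back from the running guest; the domain name instance-00000007 is taken from the systemd-machined/systemd lines further down:

    # Minimal sketch: read back the guest XML nova generated above.
    # Assumes python3-libvirt is installed and access to qemu:///system;
    # the name instance-00000007 comes from "Started Virtual Machine
    # qemu-8-instance-00000007" later in this log.
    import libvirt

    conn = libvirt.open('qemu:///system')
    dom = conn.lookupByName('instance-00000007')
    print(dom.XMLDesc(0))   # live XML, including the <devices> section logged here
    conn.close()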
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.450 187791 DEBUG nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Preparing to wait for external event network-vif-plugged-ea23d441-1529-4558-b8e2-b0240af97aef prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.451 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.451 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.452 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.453 187791 DEBUG nova.virt.libvirt.vif [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:18:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-260707184',display_name='tempest-TestNetworkBasicOps-server-260707184',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-260707184',id=7,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLd2cSTEUvFxgsV2ieVQZVMjc1Ieym1ixwtjsuBGMy7n7mrLOVqiBR013Z9hdxuXV9gHdwd006mZ3wW76WoWGANP40iQpeRarJ9oX8affq75YCUFwMGk1QpFwTF1rnEDIw==',key_name='tempest-TestNetworkBasicOps-2036254642',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1eb0a13dd3f749b583ab1cf652d42ead',ramdisk_id='',reservation_id='r-le16w1u1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-827289320',owner_user_name='tempest-TestNetworkBasicOps-827289320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:18:18Z,user_data=None,user_id='87606be528da4d588b06cc2635781b15',uuid=65bbe4d2-0789-4405-9b37-2d5bd7b5f5af,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.454 187791 DEBUG nova.network.os_vif_util [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converting VIF {"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.455 187791 DEBUG nova.network.os_vif_util [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e8:9e:29,bridge_name='br-int',has_traffic_filtering=True,id=ea23d441-1529-4558-b8e2-b0240af97aef,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea23d441-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.455 187791 DEBUG os_vif [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:9e:29,bridge_name='br-int',has_traffic_filtering=True,id=ea23d441-1529-4558-b8e2-b0240af97aef,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea23d441-15') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.457 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.457 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.458 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.468 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.469 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapea23d441-15, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.470 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapea23d441-15, col_values=(('external_ids', {'iface-id': 'ea23d441-1529-4558-b8e2-b0240af97aef', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e8:9e:29', 'vm-uuid': '65bbe4d2-0789-4405-9b37-2d5bd7b5f5af'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.472 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.475 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:18:22 compute-0 NetworkManager[56229]: <info>  [1765225102.4764] manager: (tapea23d441-15): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.482 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.484 187791 INFO os_vif [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e8:9e:29,bridge_name='br-int',has_traffic_filtering=True,id=ea23d441-1529-4558-b8e2-b0240af97aef,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea23d441-15')
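[editor's note] The "Successfully plugged vif" line closes the OVSDB transaction os-vif ran just above (AddBridgeCommand, AddPortCommand, DbSetCommand). A minimal sketch of the equivalent transaction using ovsdbapp's Open_vSwitch schema API; the db socket path is an assumption, while the port name, iface-id, MAC and vm-uuid are copied from the transaction lines in the log:

    # Minimal sketch of the OVSDB transaction os-vif ran above.
    # 'unix:/run/openvswitch/db.sock' is an assumed endpoint.
    from ovsdbapp.backend.ovs_idl import connection
    from ovsdbapp.schema.open_vswitch import impl_idl

    idl = connection.OvsdbIdl.from_server('unix:/run/openvswitch/db.sock', 'Open_vSwitch')
    api = impl_idl.OvsdbIdl(connection.Connection(idl=idl, timeout=10))

    with api.transaction(check_error=True) as txn:
        # AddPortCommand(bridge=br-int, port=tapea23d441-15, may_exist=True)
        txn.add(api.add_port('br-int', 'tapea23d441-15', may_exist=True))
        # DbSetCommand on the Interface record, same external_ids as logged
        txn.add(api.db_set('Interface', 'tapea23d441-15',
                           ('external_ids', {
                               'iface-id': 'ea23d441-1529-4558-b8e2-b0240af97aef',
                               'iface-status': 'active',
                               'attached-mac': 'fa:16:3e:e8:9e:29',
                               'vm-uuid': '65bbe4d2-0789-4405-9b37-2d5bd7b5f5af'})))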
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.555 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.555 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.555 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] No VIF found with MAC fa:16:3e:e8:9e:29, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.556 187791 INFO nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Using config drive
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.782 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.806 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.806 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.937 187791 INFO nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Creating config drive at /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk.config
Dec 08 20:18:22 compute-0 nova_compute[187787]: 2025-12-08 20:18:22.945 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi40gufxa execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.084 187791 DEBUG oslo_concurrency.processutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi40gufxa" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
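[editor's note] The two processutils lines above show nova building the config drive by shelling out to mkisofs through oslo.concurrency. A minimal sketch of the same call pattern; the staging directory name is hypothetical (the real one was a throwaway tmpdir), and the -publisher option is omitted for brevity:

    # Minimal sketch of the mkisofs call nova issued above via oslo.concurrency.
    # /tmp/metadata_dir is a hypothetical staging directory.
    from oslo_concurrency import processutils

    out, err = processutils.execute(
        '/usr/bin/mkisofs', '-o',
        '/var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk.config',
        '-ldots', '-allow-lowercase', '-allow-multidot', '-l',
        '-quiet', '-J', '-r', '-V', 'config-2',
        '/tmp/metadata_dir')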
Dec 08 20:18:23 compute-0 kernel: tapea23d441-15: entered promiscuous mode
Dec 08 20:18:23 compute-0 ovn_controller[96170]: 2025-12-08T20:18:23Z|00101|binding|INFO|Claiming lport ea23d441-1529-4558-b8e2-b0240af97aef for this chassis.
Dec 08 20:18:23 compute-0 ovn_controller[96170]: 2025-12-08T20:18:23Z|00102|binding|INFO|ea23d441-1529-4558-b8e2-b0240af97aef: Claiming fa:16:3e:e8:9e:29 10.100.0.13
Dec 08 20:18:23 compute-0 NetworkManager[56229]: <info>  [1765225103.1794] manager: (tapea23d441-15): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.178 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.182 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.198 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:9e:29 10.100.0.13'], port_security=['fa:16:3e:e8:9e:29 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '65bbe4d2-0789-4405-9b37-2d5bd7b5f5af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89a3e5bc-6928-489f-879e-9016cdae8e36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1eb0a13dd3f749b583ab1cf652d42ead', 'neutron:revision_number': '2', 'neutron:security_group_ids': '81ee238d-55fc-4374-aa3f-73da44b1a064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11170bf2-0a3b-4c62-b0d3-3f66ab038b6b, chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=ea23d441-1529-4558-b8e2-b0240af97aef) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.216 105024 INFO neutron.agent.ovn.metadata.agent [-] Port ea23d441-1529-4558-b8e2-b0240af97aef in datapath 89a3e5bc-6928-489f-879e-9016cdae8e36 bound to our chassis
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.218 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89a3e5bc-6928-489f-879e-9016cdae8e36
Dec 08 20:18:23 compute-0 systemd-udevd[217390]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:18:23 compute-0 systemd-machined[154122]: New machine qemu-8-instance-00000007.
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.244 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7b6245-5c3f-4049-932b-ca28236cdd1d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.246 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap89a3e5bc-61 in ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 08 20:18:23 compute-0 podman[217364]: 2025-12-08 20:18:23.246412454 +0000 UTC m=+0.078923587 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.251 214668 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap89a3e5bc-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.251 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[e5c427f8-7d7f-4b55-a4eb-10da903d3030]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.252 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[59185540-9054-461b-a741-df355e79416e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 NetworkManager[56229]: <info>  [1765225103.2562] device (tapea23d441-15): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 20:18:23 compute-0 NetworkManager[56229]: <info>  [1765225103.2581] device (tapea23d441-15): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.266 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.270 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:23 compute-0 ovn_controller[96170]: 2025-12-08T20:18:23Z|00103|binding|INFO|Setting lport ea23d441-1529-4558-b8e2-b0240af97aef ovn-installed in OVS
Dec 08 20:18:23 compute-0 ovn_controller[96170]: 2025-12-08T20:18:23Z|00104|binding|INFO|Setting lport ea23d441-1529-4558-b8e2-b0240af97aef up in Southbound
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.276 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:23 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-00000007.
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.276 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[79723faf-1d76-4a7b-8311-f9ab9fcc1679]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.306 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[192b002b-c362-408b-8f66-a9b1c47f2757]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.343 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[297085e4-5074-4f86-84ce-13b58be6c134]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.350 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[43035622-fb3d-4030-b67d-435167d3a9e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 NetworkManager[56229]: <info>  [1765225103.3522] manager: (tap89a3e5bc-60): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Dec 08 20:18:23 compute-0 systemd-udevd[217397]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.391 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[3fbf9dcd-2d71-411b-bb44-29e628145f62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.395 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[ca16e74a-3f68-4ceb-8dc4-dac4e2a0ef85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 NetworkManager[56229]: <info>  [1765225103.4191] device (tap89a3e5bc-60): carrier: link connected
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.425 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[5e1ef4ec-49da-43b1-b479-f82dcce2131f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.449 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f01210bd-4c82-43b9-a25f-35bac12bb264]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89a3e5bc-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:e4:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364101, 'reachable_time': 42907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217427, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.466 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[38829694-8120-4ebd-ac9d-2e0927af09d9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe84:e48d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364101, 'tstamp': 364101}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217428, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.500 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[4361a7cc-461a-4bea-a74a-4b4ec29a940c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89a3e5bc-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:e4:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364101, 'reachable_time': 42907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217429, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.545 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[743199a6-14d7-461d-a9e6-4c42d51c30ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.596 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765225103.595662, 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.596 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] VM Started (Lifecycle Event)
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.608 187791 DEBUG nova.compute.manager [req-029c0284-51c5-4823-83da-af5cf1159fe2 req-af42a6c9-3423-476d-a2b2-6be4af2b08e1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received event network-vif-plugged-ea23d441-1529-4558-b8e2-b0240af97aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.608 187791 DEBUG oslo_concurrency.lockutils [req-029c0284-51c5-4823-83da-af5cf1159fe2 req-af42a6c9-3423-476d-a2b2-6be4af2b08e1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.609 187791 DEBUG oslo_concurrency.lockutils [req-029c0284-51c5-4823-83da-af5cf1159fe2 req-af42a6c9-3423-476d-a2b2-6be4af2b08e1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.609 187791 DEBUG oslo_concurrency.lockutils [req-029c0284-51c5-4823-83da-af5cf1159fe2 req-af42a6c9-3423-476d-a2b2-6be4af2b08e1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.609 187791 DEBUG nova.compute.manager [req-029c0284-51c5-4823-83da-af5cf1159fe2 req-af42a6c9-3423-476d-a2b2-6be4af2b08e1 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Processing event network-vif-plugged-ea23d441-1529-4558-b8e2-b0240af97aef _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.610 187791 DEBUG nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.616 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.620 187791 INFO nova.virt.libvirt.driver [-] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Instance spawned successfully.
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.620 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.636 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[63dcbf56-17d3-4c00-9bb7-10b3bb5aa563]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.638 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89a3e5bc-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.639 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.640 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89a3e5bc-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:23 compute-0 NetworkManager[56229]: <info>  [1765225103.6437] manager: (tap89a3e5bc-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Dec 08 20:18:23 compute-0 kernel: tap89a3e5bc-60: entered promiscuous mode
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.644 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.647 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.649 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89a3e5bc-60, col_values=(('external_ids', {'iface-id': 'b2d0a9e7-8cb1-469e-8d3a-b408b2a6ad99'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.650 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:23 compute-0 ovn_controller[96170]: 2025-12-08T20:18:23Z|00105|binding|INFO|Releasing lport b2d0a9e7-8cb1-469e-8d3a-b408b2a6ad99 from this chassis (sb_readonly=0)
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.652 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.653 105024 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/89a3e5bc-6928-489f-879e-9016cdae8e36.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/89a3e5bc-6928-489f-879e-9016cdae8e36.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.654 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.654 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f0916c76-49ac-4d8d-be2e-0699a5be0e59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.655 105024 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: global
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     log         /dev/log local0 debug
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     log-tag     haproxy-metadata-proxy-89a3e5bc-6928-489f-879e-9016cdae8e36
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     user        root
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     group       root
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     maxconn     1024
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     pidfile     /var/lib/neutron/external/pids/89a3e5bc-6928-489f-879e-9016cdae8e36.pid.haproxy
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     daemon
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: defaults
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     log global
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     mode http
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     option httplog
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     option dontlognull
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     option http-server-close
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     option forwardfor
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     retries                 3
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     timeout http-request    30s
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     timeout connect         30s
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     timeout client          32s
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     timeout server          32s
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     timeout http-keep-alive 30s
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: listen listener
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     bind 169.254.169.254:80
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     server metadata /var/lib/neutron/metadata_proxy
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:     http-request add-header X-OVN-Network-ID 89a3e5bc-6928-489f-879e-9016cdae8e36
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 08 20:18:23 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:23.656 105024 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'env', 'PROCESS_TAG=haproxy-89a3e5bc-6928-489f-879e-9016cdae8e36', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/89a3e5bc-6928-489f-879e-9016cdae8e36.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
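[editor's note] The haproxy config dumped above binds 169.254.169.254:80 inside the ovnmeta-89a3e5bc... namespace, proxies to the /var/lib/neutron/metadata_proxy socket, and adds the X-OVN-Network-ID header itself. A minimal sketch of the request path it serves, run from inside the guest once it boots; the presence of the requests library in the guest image is an assumption:

    # Minimal sketch, from inside the guest: the standard metadata endpoint
    # that the haproxy instance started above is proxying.
    import requests

    resp = requests.get('http://169.254.169.254/openstack/latest/meta_data.json',
                        timeout=10)
    print(resp.json().get('uuid'))   # should match 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af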
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.660 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.660 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.661 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.661 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.661 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.662 187791 DEBUG nova.virt.libvirt.driver [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.665 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.697 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.698 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765225103.5957558, 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.698 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] VM Paused (Lifecycle Event)
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.745 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.749 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765225103.61478, 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.749 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] VM Resumed (Lifecycle Event)
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.760 187791 INFO nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Took 5.48 seconds to spawn the instance on the hypervisor.
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.761 187791 DEBUG nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.769 187791 DEBUG nova.network.neutron [req-01ae1fd0-ff6a-4d28-bb1b-5daf384e8ffb req-76efea93-e2bd-4274-bdbd-e2b77118278a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Updated VIF entry in instance network info cache for port ea23d441-1529-4558-b8e2-b0240af97aef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.769 187791 DEBUG nova.network.neutron [req-01ae1fd0-ff6a-4d28-bb1b-5daf384e8ffb req-76efea93-e2bd-4274-bdbd-e2b77118278a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Updating instance_info_cache with network_info: [{"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
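[editor's note] The cached network_info above still shows "active": false; the port flips to ACTIVE once OVN reports the binding up in the Southbound DB (see the ovn_controller lines earlier). A minimal sketch, assuming openstacksdk and a hypothetical clouds.yaml entry named "overcloud", of checking that port from a client:

    # Minimal sketch: inspect the Neutron port described by the info-cache
    # entry above. The cloud name "overcloud" is an assumption.
    import openstack

    conn = openstack.connect(cloud='overcloud')
    port = conn.network.get_port('ea23d441-1529-4558-b8e2-b0240af97aef')
    print(port.status, port.fixed_ips)   # ACTIVE once the OVN binding is up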
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.775 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.778 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.847 187791 DEBUG oslo_concurrency.lockutils [req-01ae1fd0-ff6a-4d28-bb1b-5daf384e8ffb req-76efea93-e2bd-4274-bdbd-e2b77118278a 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.848 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.864 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.864 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.865 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.865 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.876 187791 INFO nova.compute.manager [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Took 6.05 seconds to build instance.
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.898 187791 DEBUG oslo_concurrency.lockutils [None req-a4f132a7-b7ba-40cc-b699-93a994da7f8b 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:23 compute-0 nova_compute[187787]: 2025-12-08 20:18:23.942 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.007 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.008 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.067 187791 DEBUG oslo_concurrency.processutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:24 compute-0 podman[217471]: 2025-12-08 20:18:24.119018054 +0000 UTC m=+0.071678221 container create 18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 08 20:18:24 compute-0 systemd[1]: Started libpod-conmon-18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe.scope.
Dec 08 20:18:24 compute-0 podman[217471]: 2025-12-08 20:18:24.076519221 +0000 UTC m=+0.029179408 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 08 20:18:24 compute-0 systemd[1]: Started libcrun container.
Dec 08 20:18:24 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58bd87a7b3b7a03bbaa30f57c41cbdb29d59bdfe9895969c682f75d17567550b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 08 20:18:24 compute-0 podman[217471]: 2025-12-08 20:18:24.212649388 +0000 UTC m=+0.165309585 container init 18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 08 20:18:24 compute-0 podman[217471]: 2025-12-08 20:18:24.218557871 +0000 UTC m=+0.171218038 container start 18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 08 20:18:24 compute-0 neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36[217490]: [NOTICE]   (217494) : New worker (217496) forked
Dec 08 20:18:24 compute-0 neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36[217490]: [NOTICE]   (217494) : Loading success.
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.291 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.292 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5581MB free_disk=72.88000106811523GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.292 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.293 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.497 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Instance 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.498 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.498 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.635 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.823 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.888 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.889 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.889 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.890 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 08 20:18:24 compute-0 nova_compute[187787]: 2025-12-08 20:18:24.912 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.712 187791 DEBUG nova.compute.manager [req-db827c26-b00e-4b82-8de4-a29c6b756204 req-d563440e-1b2e-4fa3-92ff-3a632581eef0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received event network-vif-plugged-ea23d441-1529-4558-b8e2-b0240af97aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.712 187791 DEBUG oslo_concurrency.lockutils [req-db827c26-b00e-4b82-8de4-a29c6b756204 req-d563440e-1b2e-4fa3-92ff-3a632581eef0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.713 187791 DEBUG oslo_concurrency.lockutils [req-db827c26-b00e-4b82-8de4-a29c6b756204 req-d563440e-1b2e-4fa3-92ff-3a632581eef0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.713 187791 DEBUG oslo_concurrency.lockutils [req-db827c26-b00e-4b82-8de4-a29c6b756204 req-d563440e-1b2e-4fa3-92ff-3a632581eef0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.713 187791 DEBUG nova.compute.manager [req-db827c26-b00e-4b82-8de4-a29c6b756204 req-d563440e-1b2e-4fa3-92ff-3a632581eef0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] No waiting events found dispatching network-vif-plugged-ea23d441-1529-4558-b8e2-b0240af97aef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.714 187791 WARNING nova.compute.manager [req-db827c26-b00e-4b82-8de4-a29c6b756204 req-d563440e-1b2e-4fa3-92ff-3a632581eef0 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received unexpected event network-vif-plugged-ea23d441-1529-4558-b8e2-b0240af97aef for instance with vm_state active and task_state None.
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.947 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.980 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.980 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:25 compute-0 nova_compute[187787]: 2025-12-08 20:18:25.981 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:26 compute-0 nova_compute[187787]: 2025-12-08 20:18:26.106 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:27 compute-0 nova_compute[187787]: 2025-12-08 20:18:27.472 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:27 compute-0 nova_compute[187787]: 2025-12-08 20:18:27.557 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:27 compute-0 NetworkManager[56229]: <info>  [1765225107.5583] manager: (patch-br-int-to-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Dec 08 20:18:27 compute-0 NetworkManager[56229]: <info>  [1765225107.5589] manager: (patch-provnet-93fa0f7e-db91-456a-ac4d-9c874efab705-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Dec 08 20:18:27 compute-0 ovn_controller[96170]: 2025-12-08T20:18:27Z|00106|binding|INFO|Releasing lport b2d0a9e7-8cb1-469e-8d3a-b408b2a6ad99 from this chassis (sb_readonly=0)
Dec 08 20:18:27 compute-0 ovn_controller[96170]: 2025-12-08T20:18:27Z|00107|binding|INFO|Releasing lport b2d0a9e7-8cb1-469e-8d3a-b408b2a6ad99 from this chassis (sb_readonly=0)
Dec 08 20:18:27 compute-0 nova_compute[187787]: 2025-12-08 20:18:27.586 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:27 compute-0 nova_compute[187787]: 2025-12-08 20:18:27.591 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:28 compute-0 nova_compute[187787]: 2025-12-08 20:18:28.111 187791 DEBUG nova.compute.manager [req-0594783f-4ad1-4be3-96a1-2d6f28e9a58c req-7ad9b105-179b-415a-8692-541985830d57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received event network-changed-ea23d441-1529-4558-b8e2-b0240af97aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:18:28 compute-0 nova_compute[187787]: 2025-12-08 20:18:28.112 187791 DEBUG nova.compute.manager [req-0594783f-4ad1-4be3-96a1-2d6f28e9a58c req-7ad9b105-179b-415a-8692-541985830d57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Refreshing instance network info cache due to event network-changed-ea23d441-1529-4558-b8e2-b0240af97aef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:18:28 compute-0 nova_compute[187787]: 2025-12-08 20:18:28.112 187791 DEBUG oslo_concurrency.lockutils [req-0594783f-4ad1-4be3-96a1-2d6f28e9a58c req-7ad9b105-179b-415a-8692-541985830d57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:18:28 compute-0 nova_compute[187787]: 2025-12-08 20:18:28.113 187791 DEBUG oslo_concurrency.lockutils [req-0594783f-4ad1-4be3-96a1-2d6f28e9a58c req-7ad9b105-179b-415a-8692-541985830d57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:18:28 compute-0 nova_compute[187787]: 2025-12-08 20:18:28.113 187791 DEBUG nova.network.neutron [req-0594783f-4ad1-4be3-96a1-2d6f28e9a58c req-7ad9b105-179b-415a-8692-541985830d57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Refreshing network info cache for port ea23d441-1529-4558-b8e2-b0240af97aef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:18:28 compute-0 sshd-session[217506]: Invalid user mc from 200.155.38.219 port 18151
Dec 08 20:18:28 compute-0 sshd-session[217506]: Received disconnect from 200.155.38.219 port 18151:11: Bye Bye [preauth]
Dec 08 20:18:28 compute-0 sshd-session[217506]: Disconnected from invalid user mc 200.155.38.219 port 18151 [preauth]
Dec 08 20:18:29 compute-0 podman[202017]: time="2025-12-08T20:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:18:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 23776 "" "Go-http-client/1.1"
Dec 08 20:18:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3908 "" "Go-http-client/1.1"
Dec 08 20:18:29 compute-0 nova_compute[187787]: 2025-12-08 20:18:29.783 187791 DEBUG nova.network.neutron [req-0594783f-4ad1-4be3-96a1-2d6f28e9a58c req-7ad9b105-179b-415a-8692-541985830d57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Updated VIF entry in instance network info cache for port ea23d441-1529-4558-b8e2-b0240af97aef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:18:29 compute-0 nova_compute[187787]: 2025-12-08 20:18:29.785 187791 DEBUG nova.network.neutron [req-0594783f-4ad1-4be3-96a1-2d6f28e9a58c req-7ad9b105-179b-415a-8692-541985830d57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Updating instance_info_cache with network_info: [{"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:18:29 compute-0 nova_compute[187787]: 2025-12-08 20:18:29.810 187791 DEBUG oslo_concurrency.lockutils [req-0594783f-4ad1-4be3-96a1-2d6f28e9a58c req-7ad9b105-179b-415a-8692-541985830d57 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:18:31 compute-0 nova_compute[187787]: 2025-12-08 20:18:31.105 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:31 compute-0 openstack_network_exporter[204149]: ERROR   20:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:18:31 compute-0 openstack_network_exporter[204149]: ERROR   20:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:18:31 compute-0 openstack_network_exporter[204149]: ERROR   20:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:18:31 compute-0 openstack_network_exporter[204149]: ERROR   20:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:18:31 compute-0 openstack_network_exporter[204149]: ERROR   20:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:18:32 compute-0 nova_compute[187787]: 2025-12-08 20:18:32.476 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:34 compute-0 podman[217525]: 2025-12-08 20:18:34.499883397 +0000 UTC m=+0.061777713 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.4, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 08 20:18:35 compute-0 ovn_controller[96170]: 2025-12-08T20:18:35Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e8:9e:29 10.100.0.13
Dec 08 20:18:35 compute-0 ovn_controller[96170]: 2025-12-08T20:18:35Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e8:9e:29 10.100.0.13
Dec 08 20:18:36 compute-0 nova_compute[187787]: 2025-12-08 20:18:36.108 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:37 compute-0 nova_compute[187787]: 2025-12-08 20:18:37.478 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:40 compute-0 podman[217545]: 2025-12-08 20:18:40.52724345 +0000 UTC m=+0.093263563 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 08 20:18:41 compute-0 nova_compute[187787]: 2025-12-08 20:18:41.111 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:42 compute-0 nova_compute[187787]: 2025-12-08 20:18:42.481 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:42 compute-0 nova_compute[187787]: 2025-12-08 20:18:42.517 187791 INFO nova.compute.manager [None req-083fdd91-387d-47f7-b304-c52f70977a0c 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Get console output
Dec 08 20:18:42 compute-0 nova_compute[187787]: 2025-12-08 20:18:42.657 214425 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 08 20:18:43 compute-0 podman[217567]: 2025-12-08 20:18:43.526531604 +0000 UTC m=+0.083204991 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:18:43 compute-0 podman[217568]: 2025-12-08 20:18:43.52803707 +0000 UTC m=+0.077859253 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 08 20:18:44 compute-0 nova_compute[187787]: 2025-12-08 20:18:44.376 187791 DEBUG nova.compute.manager [req-a2104c74-fb5c-412a-be8e-a9bd98c32f0d req-ff92853d-2f78-43f1-83a9-514caa034ea7 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received event network-changed-ea23d441-1529-4558-b8e2-b0240af97aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:18:44 compute-0 nova_compute[187787]: 2025-12-08 20:18:44.377 187791 DEBUG nova.compute.manager [req-a2104c74-fb5c-412a-be8e-a9bd98c32f0d req-ff92853d-2f78-43f1-83a9-514caa034ea7 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Refreshing instance network info cache due to event network-changed-ea23d441-1529-4558-b8e2-b0240af97aef. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:18:44 compute-0 nova_compute[187787]: 2025-12-08 20:18:44.377 187791 DEBUG oslo_concurrency.lockutils [req-a2104c74-fb5c-412a-be8e-a9bd98c32f0d req-ff92853d-2f78-43f1-83a9-514caa034ea7 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:18:44 compute-0 nova_compute[187787]: 2025-12-08 20:18:44.377 187791 DEBUG oslo_concurrency.lockutils [req-a2104c74-fb5c-412a-be8e-a9bd98c32f0d req-ff92853d-2f78-43f1-83a9-514caa034ea7 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:18:44 compute-0 nova_compute[187787]: 2025-12-08 20:18:44.378 187791 DEBUG nova.network.neutron [req-a2104c74-fb5c-412a-be8e-a9bd98c32f0d req-ff92853d-2f78-43f1-83a9-514caa034ea7 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Refreshing network info cache for port ea23d441-1529-4558-b8e2-b0240af97aef _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:18:46 compute-0 nova_compute[187787]: 2025-12-08 20:18:45.999 187791 DEBUG nova.network.neutron [req-a2104c74-fb5c-412a-be8e-a9bd98c32f0d req-ff92853d-2f78-43f1-83a9-514caa034ea7 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Updated VIF entry in instance network info cache for port ea23d441-1529-4558-b8e2-b0240af97aef. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:18:46 compute-0 nova_compute[187787]: 2025-12-08 20:18:46.000 187791 DEBUG nova.network.neutron [req-a2104c74-fb5c-412a-be8e-a9bd98c32f0d req-ff92853d-2f78-43f1-83a9-514caa034ea7 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Updating instance_info_cache with network_info: [{"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:18:46 compute-0 nova_compute[187787]: 2025-12-08 20:18:46.036 187791 DEBUG oslo_concurrency.lockutils [req-a2104c74-fb5c-412a-be8e-a9bd98c32f0d req-ff92853d-2f78-43f1-83a9-514caa034ea7 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:18:46 compute-0 nova_compute[187787]: 2025-12-08 20:18:46.114 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:47 compute-0 nova_compute[187787]: 2025-12-08 20:18:47.484 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:48 compute-0 nova_compute[187787]: 2025-12-08 20:18:48.082 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:18:48 compute-0 nova_compute[187787]: 2025-12-08 20:18:48.109 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Triggering sync for uuid 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 08 20:18:48 compute-0 nova_compute[187787]: 2025-12-08 20:18:48.109 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:48 compute-0 nova_compute[187787]: 2025-12-08 20:18:48.110 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:48 compute-0 nova_compute[187787]: 2025-12-08 20:18:48.158 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.145 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.325 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "256c2381-1a08-4481-b6bd-63b36b87d4d9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.325 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.353 187791 DEBUG nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.457 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.457 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.466 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.467 187791 INFO nova.compute.claims [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Claim successful on node compute-0.ctlplane.example.com
Dec 08 20:18:51 compute-0 podman[217615]: 2025-12-08 20:18:51.504915444 +0000 UTC m=+0.066910993 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 08 20:18:51 compute-0 podman[217614]: 2025-12-08 20:18:51.54495087 +0000 UTC m=+0.107711283 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.666 187791 DEBUG nova.compute.provider_tree [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.690 187791 DEBUG nova.scheduler.client.report [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.724 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.727 187791 DEBUG nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.786 187791 DEBUG nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.786 187791 DEBUG nova.network.neutron [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.810 187791 INFO nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.834 187791 DEBUG nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 08 20:18:51 compute-0 sshd-session[217610]: Connection closed by 45.78.228.32 port 45908 [preauth]
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.934 187791 DEBUG nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.937 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.938 187791 INFO nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Creating image(s)
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.939 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "/var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.939 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "/var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.941 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "/var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:51 compute-0 nova_compute[187787]: 2025-12-08 20:18:51.969 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.031 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.035 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.037 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.054 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.104 187791 DEBUG nova.policy [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '87606be528da4d588b06cc2635781b15', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1eb0a13dd3f749b583ab1cf652d42ead', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.114 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.115 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.156 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac,backing_fmt=raw /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.158 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "2c120b1c2b26a18d3cbeffa85093758d4c027fac" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
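
The qemu-img calls logged here build the instance disk as a qcow2 overlay on top of the cached base image under _base. A rough reproduction of that create step (sketch only; paths are taken from the log and qemu-img is assumed to be on PATH):

    # Sketch only: the overlay-creation command Nova just ran, via subprocess.
    import subprocess

    base = '/var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac'
    disk = '/var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk'

    subprocess.run(
        ['qemu-img', 'create', '-f', 'qcow2',
         '-o', f'backing_file={base},backing_fmt=raw',
         disk, '1073741824'],               # 1 GiB virtual size, as logged
        env={'LC_ALL': 'C', 'LANG': 'C'},   # same locale pinning as processutils
        check=True,
    )
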
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.159 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.221 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/2c120b1c2b26a18d3cbeffa85093758d4c027fac --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.223 187791 DEBUG nova.virt.disk.api [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Checking if we can resize image /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.224 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.279 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.280 187791 DEBUG nova.virt.disk.api [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Cannot resize image /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
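
The "Cannot resize image ... to a smaller size" decision comes from comparing the requested size against the image's current virtual size. A hypothetical helper sketching that check (can_resize is illustrative, not Nova's API; qemu-img info --output=json reports 'virtual-size' in bytes):

    # Hypothetical helper, not Nova's API: refuse to shrink a disk image.
    import json
    import subprocess

    def can_resize(path, requested_bytes):
        out = subprocess.run(
            ['qemu-img', 'info', path, '--force-share', '--output=json'],
            capture_output=True, text=True, check=True).stdout
        return requested_bytes >= json.loads(out)['virtual-size']
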
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.281 187791 DEBUG nova.objects.instance [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lazy-loading 'migration_context' on Instance uuid 256c2381-1a08-4481-b6bd-63b36b87d4d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.303 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.304 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Ensure instance console log exists: /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.305 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.305 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.306 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
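
The Acquiring/acquired/released triplets around "vgpu_resources" and "compute_resources" are oslo.concurrency named locks. A minimal usage sketch (assumes oslo.concurrency is installed; the lock names are the ones in the log):

    # Minimal oslo.concurrency sketch of the locking pattern seen above.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('vgpu_resources')
    def allocate_mdevs():
        # critical section: this decorator is what emits the
        # acquire/release DEBUG lines logged by lockutils
        return []

    # the same primitive as a context manager
    with lockutils.lock('compute_resources'):
        pass  # e.g. claim resources for the new instance
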
Dec 08 20:18:52 compute-0 nova_compute[187787]: 2025-12-08 20:18:52.488 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:53 compute-0 nova_compute[187787]: 2025-12-08 20:18:53.269 187791 DEBUG nova.network.neutron [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Successfully created port: beabb6a9-e00a-4030-9cc5-7dd568f33bde _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 08 20:18:53 compute-0 podman[217673]: 2025-12-08 20:18:53.518764805 +0000 UTC m=+0.075337905 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:18:54 compute-0 nova_compute[187787]: 2025-12-08 20:18:54.257 187791 DEBUG nova.network.neutron [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Successfully updated port: beabb6a9-e00a-4030-9cc5-7dd568f33bde _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 08 20:18:54 compute-0 nova_compute[187787]: 2025-12-08 20:18:54.279 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "refresh_cache-256c2381-1a08-4481-b6bd-63b36b87d4d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:18:54 compute-0 nova_compute[187787]: 2025-12-08 20:18:54.279 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquired lock "refresh_cache-256c2381-1a08-4481-b6bd-63b36b87d4d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:18:54 compute-0 nova_compute[187787]: 2025-12-08 20:18:54.279 187791 DEBUG nova.network.neutron [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 08 20:18:54 compute-0 nova_compute[187787]: 2025-12-08 20:18:54.409 187791 DEBUG nova.compute.manager [req-1b2da21b-8acf-49e7-bd8a-7cbaf75184b3 req-8d0fbe32-d36a-4007-994e-36815656f8d9 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received event network-changed-beabb6a9-e00a-4030-9cc5-7dd568f33bde external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:18:54 compute-0 nova_compute[187787]: 2025-12-08 20:18:54.410 187791 DEBUG nova.compute.manager [req-1b2da21b-8acf-49e7-bd8a-7cbaf75184b3 req-8d0fbe32-d36a-4007-994e-36815656f8d9 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Refreshing instance network info cache due to event network-changed-beabb6a9-e00a-4030-9cc5-7dd568f33bde. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:18:54 compute-0 nova_compute[187787]: 2025-12-08 20:18:54.410 187791 DEBUG oslo_concurrency.lockutils [req-1b2da21b-8acf-49e7-bd8a-7cbaf75184b3 req-8d0fbe32-d36a-4007-994e-36815656f8d9 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-256c2381-1a08-4481-b6bd-63b36b87d4d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:18:54 compute-0 nova_compute[187787]: 2025-12-08 20:18:54.474 187791 DEBUG nova.network.neutron [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 08 20:18:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:54.991 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:54.992 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:54.993 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.326 187791 DEBUG nova.network.neutron [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Updating instance_info_cache with network_info: [{"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.358 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Releasing lock "refresh_cache-256c2381-1a08-4481-b6bd-63b36b87d4d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.359 187791 DEBUG nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Instance network_info: |[{"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.360 187791 DEBUG oslo_concurrency.lockutils [req-1b2da21b-8acf-49e7-bd8a-7cbaf75184b3 req-8d0fbe32-d36a-4007-994e-36815656f8d9 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-256c2381-1a08-4481-b6bd-63b36b87d4d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.361 187791 DEBUG nova.network.neutron [req-1b2da21b-8acf-49e7-bd8a-7cbaf75184b3 req-8d0fbe32-d36a-4007-994e-36815656f8d9 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Refreshing network info cache for port beabb6a9-e00a-4030-9cc5-7dd568f33bde _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.367 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Start _get_guest_xml network_info=[{"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'size': 0, 'encryption_secret_uuid': None, 'device_type': 'disk', 'guest_format': None, 'encrypted': False, 'encryption_format': None, 'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'image_id': 'ffae60d8-1843-4b3a-9d11-b077095cedb9'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.375 187791 WARNING nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.383 187791 DEBUG nova.virt.libvirt.host [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.384 187791 DEBUG nova.virt.libvirt.host [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.397 187791 DEBUG nova.virt.libvirt.host [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.398 187791 DEBUG nova.virt.libvirt.host [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.399 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.399 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-08T20:13:38Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='2f15909f-e95c-4c15-b311-ac90858a554d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-08T20:13:40Z,direct_url=<?>,disk_format='qcow2',id=ffae60d8-1843-4b3a-9d11-b077095cedb9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='aeda4e9ec2bc42cf85eb51bfa0b2ae46',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-08T20:13:42Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.400 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.401 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.401 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.402 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.402 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.403 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.403 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.404 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.404 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.405 187791 DEBUG nova.virt.hardware [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.415 187791 DEBUG nova.virt.libvirt.vif [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:18:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1304981188',display_name='tempest-TestNetworkBasicOps-server-1304981188',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1304981188',id=8,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBES18eAtKe05A8/U0rym1IuSCanZGlJWv2Hi4ExvNIy+TaC+PRVLWsJq2lio2gCxwu3+KUahAg7F3QKMMF0bm0eOE+UqOSNrztjiWEJTT7oaan2qWxjg+hOF1VDU5fuj4w==',key_name='tempest-TestNetworkBasicOps-302579618',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1eb0a13dd3f749b583ab1cf652d42ead',ramdisk_id='',reservation_id='r-qdd04jb8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-827289320',owner_user_name='tempest-TestNetworkBasicOps-827289320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:18:51Z,user_data=None,user_id='87606be528da4d588b06cc2635781b15',uuid=256c2381-1a08-4481-b6bd-63b36b87d4d9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.416 187791 DEBUG nova.network.os_vif_util [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converting VIF {"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.417 187791 DEBUG nova.network.os_vif_util [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:69:a8,bridge_name='br-int',has_traffic_filtering=True,id=beabb6a9-e00a-4030-9cc5-7dd568f33bde,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeabb6a9-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.419 187791 DEBUG nova.objects.instance [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lazy-loading 'pci_devices' on Instance uuid 256c2381-1a08-4481-b6bd-63b36b87d4d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.456 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] End _get_guest_xml xml=<domain type="kvm">
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <uuid>256c2381-1a08-4481-b6bd-63b36b87d4d9</uuid>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <name>instance-00000008</name>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <memory>131072</memory>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <vcpu>1</vcpu>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <metadata>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <nova:name>tempest-TestNetworkBasicOps-server-1304981188</nova:name>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <nova:creationTime>2025-12-08 20:18:55</nova:creationTime>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <nova:flavor name="m1.nano">
Dec 08 20:18:55 compute-0 nova_compute[187787]:         <nova:memory>128</nova:memory>
Dec 08 20:18:55 compute-0 nova_compute[187787]:         <nova:disk>1</nova:disk>
Dec 08 20:18:55 compute-0 nova_compute[187787]:         <nova:swap>0</nova:swap>
Dec 08 20:18:55 compute-0 nova_compute[187787]:         <nova:ephemeral>0</nova:ephemeral>
Dec 08 20:18:55 compute-0 nova_compute[187787]:         <nova:vcpus>1</nova:vcpus>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       </nova:flavor>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <nova:owner>
Dec 08 20:18:55 compute-0 nova_compute[187787]:         <nova:user uuid="87606be528da4d588b06cc2635781b15">tempest-TestNetworkBasicOps-827289320-project-member</nova:user>
Dec 08 20:18:55 compute-0 nova_compute[187787]:         <nova:project uuid="1eb0a13dd3f749b583ab1cf652d42ead">tempest-TestNetworkBasicOps-827289320</nova:project>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       </nova:owner>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <nova:root type="image" uuid="ffae60d8-1843-4b3a-9d11-b077095cedb9"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <nova:ports>
Dec 08 20:18:55 compute-0 nova_compute[187787]:         <nova:port uuid="beabb6a9-e00a-4030-9cc5-7dd568f33bde">
Dec 08 20:18:55 compute-0 nova_compute[187787]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:         </nova:port>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       </nova:ports>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     </nova:instance>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   </metadata>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <sysinfo type="smbios">
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <system>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <entry name="manufacturer">RDO</entry>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <entry name="product">OpenStack Compute</entry>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <entry name="serial">256c2381-1a08-4481-b6bd-63b36b87d4d9</entry>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <entry name="uuid">256c2381-1a08-4481-b6bd-63b36b87d4d9</entry>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <entry name="family">Virtual Machine</entry>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     </system>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   </sysinfo>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <os>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <type arch="x86_64" machine="q35">hvm</type>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <boot dev="hd"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <smbios mode="sysinfo"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   </os>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <features>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <acpi/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <apic/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <vmcoreinfo/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   </features>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <clock offset="utc">
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <timer name="pit" tickpolicy="delay"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <timer name="rtc" tickpolicy="catchup"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <timer name="hpet" present="no"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   </clock>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <cpu mode="host-model" match="exact">
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <topology sockets="1" cores="1" threads="1"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   </cpu>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   <devices>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <disk type="file" device="disk">
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <driver name="qemu" type="qcow2" cache="none"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <target dev="vda" bus="virtio"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <disk type="file" device="cdrom">
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <driver name="qemu" type="raw" cache="none"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <source file="/var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk.config"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <target dev="sda" bus="sata"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     </disk>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <interface type="ethernet">
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <mac address="fa:16:3e:cc:69:a8"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <driver name="vhost" rx_queue_size="512"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <mtu size="1442"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <target dev="tapbeabb6a9-e0"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     </interface>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <serial type="pty">
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <log file="/var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/console.log" append="off"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     </serial>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <video>
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <model type="virtio"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     </video>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <input type="tablet" bus="usb"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <rng model="virtio">
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <backend model="random">/dev/urandom</backend>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     </rng>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="pci" model="pcie-root-port"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <controller type="usb" index="0"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     <memballoon model="virtio">
Dec 08 20:18:55 compute-0 nova_compute[187787]:       <stats period="10"/>
Dec 08 20:18:55 compute-0 nova_compute[187787]:     </memballoon>
Dec 08 20:18:55 compute-0 nova_compute[187787]:   </devices>
Dec 08 20:18:55 compute-0 nova_compute[187787]: </domain>
Dec 08 20:18:55 compute-0 nova_compute[187787]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
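
With the guest XML generated, the driver hands it to libvirt to define and start the domain. A bare-bones sketch of that step using libvirt-python (assumes a local qemu:///system connection; the XML file path is hypothetical):

    # Sketch: define the domain from XML like the document above and power it on.
    import libvirt

    with open('/tmp/instance-00000008.xml') as f:
        xml = f.read()

    conn = libvirt.open('qemu:///system')
    try:
        dom = conn.defineXML(xml)   # persist the domain definition
        dom.create()                # start the guest
    finally:
        conn.close()
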
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.458 187791 DEBUG nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Preparing to wait for external event network-vif-plugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.459 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.460 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.461 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.463 187791 DEBUG nova.virt.libvirt.vif [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-08T20:18:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1304981188',display_name='tempest-TestNetworkBasicOps-server-1304981188',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1304981188',id=8,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBES18eAtKe05A8/U0rym1IuSCanZGlJWv2Hi4ExvNIy+TaC+PRVLWsJq2lio2gCxwu3+KUahAg7F3QKMMF0bm0eOE+UqOSNrztjiWEJTT7oaan2qWxjg+hOF1VDU5fuj4w==',key_name='tempest-TestNetworkBasicOps-302579618',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1eb0a13dd3f749b583ab1cf652d42ead',ramdisk_id='',reservation_id='r-qdd04jb8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-827289320',owner_user_name='tempest-TestNetworkBasicOps-827289320-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-08T20:18:51Z,user_data=None,user_id='87606be528da4d588b06cc2635781b15',uuid=256c2381-1a08-4481-b6bd-63b36b87d4d9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.463 187791 DEBUG nova.network.os_vif_util [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converting VIF {"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.465 187791 DEBUG nova.network.os_vif_util [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cc:69:a8,bridge_name='br-int',has_traffic_filtering=True,id=beabb6a9-e00a-4030-9cc5-7dd568f33bde,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeabb6a9-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.468 187791 DEBUG os_vif [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:69:a8,bridge_name='br-int',has_traffic_filtering=True,id=beabb6a9-e00a-4030-9cc5-7dd568f33bde,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeabb6a9-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.470 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.471 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.473 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.482 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.483 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbeabb6a9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.484 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbeabb6a9-e0, col_values=(('external_ids', {'iface-id': 'beabb6a9-e00a-4030-9cc5-7dd568f33bde', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cc:69:a8', 'vm-uuid': '256c2381-1a08-4481-b6bd-63b36b87d4d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.487 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:55 compute-0 NetworkManager[56229]: <info>  [1765225135.4878] manager: (tapbeabb6a9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.490 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.500 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.502 187791 INFO os_vif [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cc:69:a8,bridge_name='br-int',has_traffic_filtering=True,id=beabb6a9-e00a-4030-9cc5-7dd568f33bde,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeabb6a9-e0')
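
The AddBridgeCommand / AddPortCommand / DbSetCommand transactions above are os-vif wiring the tap device into br-int over OVSDB; the AddBridgeCommand reports "Transaction caused no change" because br-int already exists on an OVN chassis. A rough command-line equivalent of the same wiring, as a sketch only (values copied from the logged commands; os-vif itself talks to ovsdb-server through ovsdbapp rather than shelling out):

    # Sketch: reproduce the OVSDB changes from the plug transaction above with
    # ovs-vsctl, using the port name, iface-id, MAC and instance UUID from the log.
    import subprocess

    port = "tapbeabb6a9-e0"
    subprocess.run(
        ["ovs-vsctl",
         "--", "--may-exist", "add-port", "br-int", port,
         "--", "set", "Interface", port,
         "external_ids:iface-id=beabb6a9-e00a-4030-9cc5-7dd568f33bde",
         "external_ids:iface-status=active",
         "external_ids:attached-mac=fa:16:3e:cc:69:a8",
         "external_ids:vm-uuid=256c2381-1a08-4481-b6bd-63b36b87d4d9"],
        check=True,
    )

The iface-id in external_ids is what ovn-controller matches against the Port_Binding logical_port a moment later when it claims the port.
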
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.589 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.589 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.590 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] No VIF found with MAC fa:16:3e:cc:69:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 08 20:18:55 compute-0 nova_compute[187787]: 2025-12-08 20:18:55.590 187791 INFO nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Using config drive
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.022 187791 INFO nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Creating config drive at /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk.config
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.028 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppr_9fufr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.147 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.162 187791 DEBUG oslo_concurrency.processutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppr_9fufr" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
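
The config drive is a plain ISO 9660 image labelled config-2, so the mkisofs invocation logged above can be replayed as-is; a minimal sketch (paths and flags copied from the logged command; the /tmp staging directory is ephemeral and is removed once the image is built):

    # Sketch: the same mkisofs call Nova logged above, run via subprocess.
    # The staging directory must hold the config-drive tree that Nova generated.
    import subprocess

    subprocess.run(
        ["/usr/bin/mkisofs",
         "-o", "/var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9/disk.config",
         "-ldots", "-allow-lowercase", "-allow-multidot", "-l",
         "-publisher", "OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9",
         "-quiet", "-J", "-r", "-V", "config-2",
         "/tmp/tmppr_9fufr"],
        check=True,
    )
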
Dec 08 20:18:56 compute-0 kernel: tapbeabb6a9-e0: entered promiscuous mode
Dec 08 20:18:56 compute-0 NetworkManager[56229]: <info>  [1765225136.2441] manager: (tapbeabb6a9-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.246 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:56 compute-0 ovn_controller[96170]: 2025-12-08T20:18:56Z|00108|binding|INFO|Claiming lport beabb6a9-e00a-4030-9cc5-7dd568f33bde for this chassis.
Dec 08 20:18:56 compute-0 ovn_controller[96170]: 2025-12-08T20:18:56Z|00109|binding|INFO|beabb6a9-e00a-4030-9cc5-7dd568f33bde: Claiming fa:16:3e:cc:69:a8 10.100.0.14
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.249 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.263 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:69:a8 10.100.0.14'], port_security=['fa:16:3e:cc:69:a8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '256c2381-1a08-4481-b6bd-63b36b87d4d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89a3e5bc-6928-489f-879e-9016cdae8e36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1eb0a13dd3f749b583ab1cf652d42ead', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7ca893d5-586b-45b7-a768-5eae26555421', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11170bf2-0a3b-4c62-b0d3-3f66ab038b6b, chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=beabb6a9-e00a-4030-9cc5-7dd568f33bde) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:18:56 compute-0 ovn_controller[96170]: 2025-12-08T20:18:56Z|00110|binding|INFO|Setting lport beabb6a9-e00a-4030-9cc5-7dd568f33bde up in Southbound
Dec 08 20:18:56 compute-0 ovn_controller[96170]: 2025-12-08T20:18:56Z|00111|binding|INFO|Setting lport beabb6a9-e00a-4030-9cc5-7dd568f33bde ovn-installed in OVS
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.263 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.265 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.265 105024 INFO neutron.agent.ovn.metadata.agent [-] Port beabb6a9-e00a-4030-9cc5-7dd568f33bde in datapath 89a3e5bc-6928-489f-879e-9016cdae8e36 bound to our chassis
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.266 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89a3e5bc-6928-489f-879e-9016cdae8e36
Dec 08 20:18:56 compute-0 systemd-udevd[217716]: Network interface NamePolicy= disabled on kernel command line.
Dec 08 20:18:56 compute-0 systemd-machined[154122]: New machine qemu-9-instance-00000008.
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.295 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[13bf110a-c54a-4946-8acb-27eb7acc01cf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:56 compute-0 NetworkManager[56229]: <info>  [1765225136.3022] device (tapbeabb6a9-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 08 20:18:56 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-00000008.
Dec 08 20:18:56 compute-0 NetworkManager[56229]: <info>  [1765225136.3034] device (tapbeabb6a9-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.333 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[65172b98-ea74-4f00-9854-b283596904c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.336 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7191ae-54b4-484c-a864-010b96b00c21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.367 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[6aaf036b-9b58-4c04-9180-7a622cd3224d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.389 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0a738e-d079-455a-838b-41ac5e99d79a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89a3e5bc-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:e4:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364101, 'reachable_time': 42907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217729, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.410 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[69ca9217-c436-4771-b813-23f15d7198d7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap89a3e5bc-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364118, 'tstamp': 364118}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217731, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap89a3e5bc-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364123, 'tstamp': 364123}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217731, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.412 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89a3e5bc-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.414 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.416 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89a3e5bc-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.416 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.416 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89a3e5bc-60, col_values=(('external_ids', {'iface-id': 'b2d0a9e7-8cb1-469e-8d3a-b408b2a6ad99'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:18:56 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:18:56.417 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.654 187791 DEBUG nova.compute.manager [req-69183911-d2ef-436e-bca2-113706b369f7 req-46a6e39d-8567-4401-a9a4-ce27cc8f1082 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received event network-vif-plugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.655 187791 DEBUG oslo_concurrency.lockutils [req-69183911-d2ef-436e-bca2-113706b369f7 req-46a6e39d-8567-4401-a9a4-ce27cc8f1082 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.655 187791 DEBUG oslo_concurrency.lockutils [req-69183911-d2ef-436e-bca2-113706b369f7 req-46a6e39d-8567-4401-a9a4-ce27cc8f1082 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.656 187791 DEBUG oslo_concurrency.lockutils [req-69183911-d2ef-436e-bca2-113706b369f7 req-46a6e39d-8567-4401-a9a4-ce27cc8f1082 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.656 187791 DEBUG nova.compute.manager [req-69183911-d2ef-436e-bca2-113706b369f7 req-46a6e39d-8567-4401-a9a4-ce27cc8f1082 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Processing event network-vif-plugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.895 187791 DEBUG nova.network.neutron [req-1b2da21b-8acf-49e7-bd8a-7cbaf75184b3 req-8d0fbe32-d36a-4007-994e-36815656f8d9 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Updated VIF entry in instance network info cache for port beabb6a9-e00a-4030-9cc5-7dd568f33bde. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.896 187791 DEBUG nova.network.neutron [req-1b2da21b-8acf-49e7-bd8a-7cbaf75184b3 req-8d0fbe32-d36a-4007-994e-36815656f8d9 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Updating instance_info_cache with network_info: [{"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:18:56 compute-0 nova_compute[187787]: 2025-12-08 20:18:56.923 187791 DEBUG oslo_concurrency.lockutils [req-1b2da21b-8acf-49e7-bd8a-7cbaf75184b3 req-8d0fbe32-d36a-4007-994e-36815656f8d9 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-256c2381-1a08-4481-b6bd-63b36b87d4d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.321 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765225137.3212273, 256c2381-1a08-4481-b6bd-63b36b87d4d9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.322 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] VM Started (Lifecycle Event)
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.324 187791 DEBUG nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.329 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.334 187791 INFO nova.virt.libvirt.driver [-] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Instance spawned successfully.
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.335 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.349 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.355 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
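
The integers in "current DB power_state: 0, VM power_state: 1" are Nova power-state constants; a small lookup table, assuming the usual nova.compute.power_state values (verify against your release), reads them as follows:

    # Sketch: commonly defined nova.compute.power_state values.
    POWER_STATE = {
        0: "NOSTATE",    # DB value before the guest has been observed running
        1: "RUNNING",    # what libvirt reports once the domain is started
        3: "PAUSED",
        4: "SHUTDOWN",
        6: "CRASHED",
        7: "SUSPENDED",
    }
    # The sync message above therefore reads: database still NOSTATE while the
    # hypervisor already reports RUNNING, which is expected mid-spawn.
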
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.371 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.372 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.373 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.373 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.374 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.375 187791 DEBUG nova.virt.libvirt.driver [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.396 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.397 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765225137.3241305, 256c2381-1a08-4481-b6bd-63b36b87d4d9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.397 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] VM Paused (Lifecycle Event)
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.439 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.445 187791 DEBUG nova.virt.driver [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] Emitting event <LifecycleEvent: 1765225137.3289912, 256c2381-1a08-4481-b6bd-63b36b87d4d9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.445 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] VM Resumed (Lifecycle Event)
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.493 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.498 187791 DEBUG nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.513 187791 INFO nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Took 5.58 seconds to spawn the instance on the hypervisor.
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.514 187791 DEBUG nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.564 187791 INFO nova.compute.manager [None req-fd42352a-bff5-4ec0-9d00-fccc2d3f5eba - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.613 187791 INFO nova.compute.manager [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Took 6.19 seconds to build instance.
Dec 08 20:18:57 compute-0 nova_compute[187787]: 2025-12-08 20:18:57.636 187791 DEBUG oslo_concurrency.lockutils [None req-c75b0d9b-f21a-4ee7-a639-39e007603ae8 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.311s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:58 compute-0 nova_compute[187787]: 2025-12-08 20:18:58.808 187791 DEBUG nova.compute.manager [req-333b195f-d099-4ca0-a9f5-a9a0c77fe721 req-9297e866-f215-4159-b685-a56eac02e775 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received event network-vif-plugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:18:58 compute-0 nova_compute[187787]: 2025-12-08 20:18:58.809 187791 DEBUG oslo_concurrency.lockutils [req-333b195f-d099-4ca0-a9f5-a9a0c77fe721 req-9297e866-f215-4159-b685-a56eac02e775 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:18:58 compute-0 nova_compute[187787]: 2025-12-08 20:18:58.810 187791 DEBUG oslo_concurrency.lockutils [req-333b195f-d099-4ca0-a9f5-a9a0c77fe721 req-9297e866-f215-4159-b685-a56eac02e775 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:18:58 compute-0 nova_compute[187787]: 2025-12-08 20:18:58.811 187791 DEBUG oslo_concurrency.lockutils [req-333b195f-d099-4ca0-a9f5-a9a0c77fe721 req-9297e866-f215-4159-b685-a56eac02e775 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:18:58 compute-0 nova_compute[187787]: 2025-12-08 20:18:58.811 187791 DEBUG nova.compute.manager [req-333b195f-d099-4ca0-a9f5-a9a0c77fe721 req-9297e866-f215-4159-b685-a56eac02e775 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] No waiting events found dispatching network-vif-plugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:18:58 compute-0 nova_compute[187787]: 2025-12-08 20:18:58.812 187791 WARNING nova.compute.manager [req-333b195f-d099-4ca0-a9f5-a9a0c77fe721 req-9297e866-f215-4159-b685-a56eac02e775 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received unexpected event network-vif-plugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde for instance with vm_state active and task_state None.
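
These network-vif-plugged deliveries come in through Nova's os-server-external-events API, which Neutron calls when the OVN port binding goes up; the WARNING here is benign, because spawn already consumed the event it was waiting for and no waiter remains for the second delivery. The event body Neutron posts looks roughly like this (a sketch of the payload shape, field names per that API):

    # Sketch: the external-event payload Neutron sends to Nova for this port.
    event_body = {
        "events": [{
            "name": "network-vif-plugged",
            "server_uuid": "256c2381-1a08-4481-b6bd-63b36b87d4d9",
            "tag": "beabb6a9-e00a-4030-9cc5-7dd568f33bde",  # the Neutron port id
            "status": "completed",
        }]
    }
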
Dec 08 20:18:59 compute-0 podman[202017]: time="2025-12-08T20:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:18:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 23776 "" "Go-http-client/1.1"
Dec 08 20:18:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3922 "" "Go-http-client/1.1"
Dec 08 20:19:00 compute-0 nova_compute[187787]: 2025-12-08 20:19:00.488 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:01 compute-0 nova_compute[187787]: 2025-12-08 20:19:01.021 187791 DEBUG nova.compute.manager [req-0a261772-0f60-42b8-ab03-8b21ff3b8e2b req-4849a037-c6ee-490a-ad21-dc93ffe61b7c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received event network-changed-beabb6a9-e00a-4030-9cc5-7dd568f33bde external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:19:01 compute-0 nova_compute[187787]: 2025-12-08 20:19:01.021 187791 DEBUG nova.compute.manager [req-0a261772-0f60-42b8-ab03-8b21ff3b8e2b req-4849a037-c6ee-490a-ad21-dc93ffe61b7c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Refreshing instance network info cache due to event network-changed-beabb6a9-e00a-4030-9cc5-7dd568f33bde. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 08 20:19:01 compute-0 nova_compute[187787]: 2025-12-08 20:19:01.022 187791 DEBUG oslo_concurrency.lockutils [req-0a261772-0f60-42b8-ab03-8b21ff3b8e2b req-4849a037-c6ee-490a-ad21-dc93ffe61b7c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "refresh_cache-256c2381-1a08-4481-b6bd-63b36b87d4d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 08 20:19:01 compute-0 nova_compute[187787]: 2025-12-08 20:19:01.022 187791 DEBUG oslo_concurrency.lockutils [req-0a261772-0f60-42b8-ab03-8b21ff3b8e2b req-4849a037-c6ee-490a-ad21-dc93ffe61b7c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquired lock "refresh_cache-256c2381-1a08-4481-b6bd-63b36b87d4d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 08 20:19:01 compute-0 nova_compute[187787]: 2025-12-08 20:19:01.022 187791 DEBUG nova.network.neutron [req-0a261772-0f60-42b8-ab03-8b21ff3b8e2b req-4849a037-c6ee-490a-ad21-dc93ffe61b7c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Refreshing network info cache for port beabb6a9-e00a-4030-9cc5-7dd568f33bde _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 08 20:19:01 compute-0 nova_compute[187787]: 2025-12-08 20:19:01.152 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:01 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:01.301 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ea:67:f9', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1e:d7:e5:ba:bd:f4'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:19:01 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:01.304 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 08 20:19:01 compute-0 nova_compute[187787]: 2025-12-08 20:19:01.304 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:01 compute-0 openstack_network_exporter[204149]: ERROR   20:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:19:01 compute-0 openstack_network_exporter[204149]: ERROR   20:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:19:01 compute-0 openstack_network_exporter[204149]: ERROR   20:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:19:01 compute-0 openstack_network_exporter[204149]: ERROR   20:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath

Dec 08 20:19:01 compute-0 openstack_network_exporter[204149]: ERROR   20:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:19:02 compute-0 nova_compute[187787]: 2025-12-08 20:19:02.919 187791 DEBUG nova.network.neutron [req-0a261772-0f60-42b8-ab03-8b21ff3b8e2b req-4849a037-c6ee-490a-ad21-dc93ffe61b7c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Updated VIF entry in instance network info cache for port beabb6a9-e00a-4030-9cc5-7dd568f33bde. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 08 20:19:02 compute-0 nova_compute[187787]: 2025-12-08 20:19:02.920 187791 DEBUG nova.network.neutron [req-0a261772-0f60-42b8-ab03-8b21ff3b8e2b req-4849a037-c6ee-490a-ad21-dc93ffe61b7c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Updating instance_info_cache with network_info: [{"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:19:02 compute-0 nova_compute[187787]: 2025-12-08 20:19:02.969 187791 DEBUG oslo_concurrency.lockutils [req-0a261772-0f60-42b8-ab03-8b21ff3b8e2b req-4849a037-c6ee-490a-ad21-dc93ffe61b7c 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Releasing lock "refresh_cache-256c2381-1a08-4481-b6bd-63b36b87d4d9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 08 20:19:05 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:05.306 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:19:05 compute-0 nova_compute[187787]: 2025-12-08 20:19:05.491 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:05 compute-0 podman[217739]: 2025-12-08 20:19:05.549685491 +0000 UTC m=+0.106583667 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42)
Dec 08 20:19:06 compute-0 nova_compute[187787]: 2025-12-08 20:19:06.155 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:09 compute-0 ovn_controller[96170]: 2025-12-08T20:19:09Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:cc:69:a8 10.100.0.14
Dec 08 20:19:09 compute-0 ovn_controller[96170]: 2025-12-08T20:19:09Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:cc:69:a8 10.100.0.14
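
The DHCPOFFER/DHCPACK lines show ovn-controller answering the guest's DHCP request itself through its pinctrl thread (no dnsmasq involved); the options it hands out come from DHCP_Options rows in the OVN northbound database. One way to inspect them, assuming ovn-nbctl can reach the northbound DB from wherever it is run, is sketched below:

    # Sketch: dump the native-DHCP option sets OVN serves for its logical switches.
    import subprocess

    subprocess.run(["ovn-nbctl", "list", "DHCP_Options"], check=True)
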
Dec 08 20:19:10 compute-0 nova_compute[187787]: 2025-12-08 20:19:10.494 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:11 compute-0 nova_compute[187787]: 2025-12-08 20:19:11.156 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:11 compute-0 podman[217773]: 2025-12-08 20:19:11.533765472 +0000 UTC m=+0.095544692 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Dec 08 20:19:14 compute-0 podman[217794]: 2025-12-08 20:19:14.532537118 +0000 UTC m=+0.094032024 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:19:14 compute-0 podman[217795]: 2025-12-08 20:19:14.548887322 +0000 UTC m=+0.102683207 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:19:15 compute-0 nova_compute[187787]: 2025-12-08 20:19:15.498 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:15 compute-0 nova_compute[187787]: 2025-12-08 20:19:15.759 187791 INFO nova.compute.manager [None req-e792db9f-3328-4cbb-bd2e-f418a26111a5 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Get console output
Dec 08 20:19:15 compute-0 nova_compute[187787]: 2025-12-08 20:19:15.769 214425 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.114 187791 DEBUG oslo_concurrency.lockutils [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "256c2381-1a08-4481-b6bd-63b36b87d4d9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.115 187791 DEBUG oslo_concurrency.lockutils [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.116 187791 DEBUG oslo_concurrency.lockutils [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.116 187791 DEBUG oslo_concurrency.lockutils [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.117 187791 DEBUG oslo_concurrency.lockutils [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.119 187791 INFO nova.compute.manager [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Terminating instance
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.121 187791 DEBUG nova.compute.manager [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 08 20:19:16 compute-0 kernel: tapbeabb6a9-e0 (unregistering): left promiscuous mode
Dec 08 20:19:16 compute-0 NetworkManager[56229]: <info>  [1765225156.1527] device (tapbeabb6a9-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 08 20:19:16 compute-0 ovn_controller[96170]: 2025-12-08T20:19:16Z|00112|binding|INFO|Releasing lport beabb6a9-e00a-4030-9cc5-7dd568f33bde from this chassis (sb_readonly=0)
Dec 08 20:19:16 compute-0 ovn_controller[96170]: 2025-12-08T20:19:16Z|00113|binding|INFO|Setting lport beabb6a9-e00a-4030-9cc5-7dd568f33bde down in Southbound
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.164 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:16 compute-0 ovn_controller[96170]: 2025-12-08T20:19:16Z|00114|binding|INFO|Removing iface tapbeabb6a9-e0 ovn-installed in OVS
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.170 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.179 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cc:69:a8 10.100.0.14'], port_security=['fa:16:3e:cc:69:a8 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '256c2381-1a08-4481-b6bd-63b36b87d4d9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89a3e5bc-6928-489f-879e-9016cdae8e36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1eb0a13dd3f749b583ab1cf652d42ead', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7ca893d5-586b-45b7-a768-5eae26555421', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com', 'neutron:port_fip': '192.168.122.225'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11170bf2-0a3b-4c62-b0d3-3f66ab038b6b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=beabb6a9-e00a-4030-9cc5-7dd568f33bde) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.181 105024 INFO neutron.agent.ovn.metadata.agent [-] Port beabb6a9-e00a-4030-9cc5-7dd568f33bde in datapath 89a3e5bc-6928-489f-879e-9016cdae8e36 unbound from our chassis
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.182 105024 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 89a3e5bc-6928-489f-879e-9016cdae8e36
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.186 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.203 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[84f05214-3a72-4df0-9a6d-60fc17435298]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:16 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000008.scope: Deactivated successfully.
Dec 08 20:19:16 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000008.scope: Consumed 12.979s CPU time.
Dec 08 20:19:16 compute-0 systemd-machined[154122]: Machine qemu-9-instance-00000008 terminated.
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.243 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[5d122172-6d64-4947-ba75-05cea501ba74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.247 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[351d8018-4f2a-4d17-9b92-d668b5427d29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.282 214695 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee82903-ceee-4612-a26f-f964ce9ca3ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.302 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[2abc8871-63f8-4b46-94b7-a12310ae01f0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap89a3e5bc-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:84:e4:8d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364101, 'reachable_time': 42907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 
'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217849, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.324 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[f82f3352-d1ce-4c32-8799-ba269ccb922e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap89a3e5bc-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364118, 'tstamp': 364118}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217850, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap89a3e5bc-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364123, 'tstamp': 364123}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217850, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.326 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89a3e5bc-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.328 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.335 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.336 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap89a3e5bc-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.336 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.337 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap89a3e5bc-60, col_values=(('external_ids', {'iface-id': 'b2d0a9e7-8cb1-469e-8d3a-b408b2a6ad99'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:19:16 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:16.338 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.412 187791 INFO nova.virt.libvirt.driver [-] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Instance destroyed successfully.
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.413 187791 DEBUG nova.objects.instance [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lazy-loading 'resources' on Instance uuid 256c2381-1a08-4481-b6bd-63b36b87d4d9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.435 187791 DEBUG nova.virt.libvirt.vif [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:18:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1304981188',display_name='tempest-TestNetworkBasicOps-server-1304981188',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1304981188',id=8,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBES18eAtKe05A8/U0rym1IuSCanZGlJWv2Hi4ExvNIy+TaC+PRVLWsJq2lio2gCxwu3+KUahAg7F3QKMMF0bm0eOE+UqOSNrztjiWEJTT7oaan2qWxjg+hOF1VDU5fuj4w==',key_name='tempest-TestNetworkBasicOps-302579618',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:18:57Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1eb0a13dd3f749b583ab1cf652d42ead',ramdisk_id='',reservation_id='r-qdd04jb8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-827289320',owner_user_name='tempest-TestNetworkBasicOps-827289320-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:18:57Z,user_data=None,user_id='87606be528da4d588b06cc2635781b15',uuid=256c2381-1a08-4481-b6bd-63b36b87d4d9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", 
"ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.436 187791 DEBUG nova.network.os_vif_util [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converting VIF {"id": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "address": "fa:16:3e:cc:69:a8", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.225", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbeabb6a9-e0", "ovs_interfaceid": "beabb6a9-e00a-4030-9cc5-7dd568f33bde", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.437 187791 DEBUG nova.network.os_vif_util [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:cc:69:a8,bridge_name='br-int',has_traffic_filtering=True,id=beabb6a9-e00a-4030-9cc5-7dd568f33bde,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeabb6a9-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.437 187791 DEBUG os_vif [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:69:a8,bridge_name='br-int',has_traffic_filtering=True,id=beabb6a9-e00a-4030-9cc5-7dd568f33bde,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeabb6a9-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.440 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.441 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbeabb6a9-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.444 187791 DEBUG nova.compute.manager [req-90785d18-e855-45b2-ac0d-eafef460d9df req-eb7f652c-bcc8-423f-8868-f5f87cad4b8e 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received event network-vif-unplugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.444 187791 DEBUG oslo_concurrency.lockutils [req-90785d18-e855-45b2-ac0d-eafef460d9df req-eb7f652c-bcc8-423f-8868-f5f87cad4b8e 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.444 187791 DEBUG oslo_concurrency.lockutils [req-90785d18-e855-45b2-ac0d-eafef460d9df req-eb7f652c-bcc8-423f-8868-f5f87cad4b8e 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.445 187791 DEBUG oslo_concurrency.lockutils [req-90785d18-e855-45b2-ac0d-eafef460d9df req-eb7f652c-bcc8-423f-8868-f5f87cad4b8e 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.445 187791 DEBUG nova.compute.manager [req-90785d18-e855-45b2-ac0d-eafef460d9df req-eb7f652c-bcc8-423f-8868-f5f87cad4b8e 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] No waiting events found dispatching network-vif-unplugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.445 187791 DEBUG nova.compute.manager [req-90785d18-e855-45b2-ac0d-eafef460d9df req-eb7f652c-bcc8-423f-8868-f5f87cad4b8e 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received event network-vif-unplugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.446 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.447 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.451 187791 INFO os_vif [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:cc:69:a8,bridge_name='br-int',has_traffic_filtering=True,id=beabb6a9-e00a-4030-9cc5-7dd568f33bde,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbeabb6a9-e0')
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.452 187791 INFO nova.virt.libvirt.driver [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Deleting instance files /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9_del
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.453 187791 INFO nova.virt.libvirt.driver [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Deletion of /var/lib/nova/instances/256c2381-1a08-4481-b6bd-63b36b87d4d9_del complete
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.536 187791 INFO nova.compute.manager [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Took 0.41 seconds to destroy the instance on the hypervisor.
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.537 187791 DEBUG oslo.service.loopingcall [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.537 187791 DEBUG nova.compute.manager [-] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 08 20:19:16 compute-0 nova_compute[187787]: 2025-12-08 20:19:16.537 187791 DEBUG nova.network.neutron [-] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 08 20:19:17 compute-0 nova_compute[187787]: 2025-12-08 20:19:17.436 187791 DEBUG nova.network.neutron [-] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:19:17 compute-0 nova_compute[187787]: 2025-12-08 20:19:17.465 187791 INFO nova.compute.manager [-] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Took 0.93 seconds to deallocate network for instance.
Dec 08 20:19:17 compute-0 nova_compute[187787]: 2025-12-08 20:19:17.547 187791 DEBUG oslo_concurrency.lockutils [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:17 compute-0 nova_compute[187787]: 2025-12-08 20:19:17.548 187791 DEBUG oslo_concurrency.lockutils [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:17 compute-0 nova_compute[187787]: 2025-12-08 20:19:17.645 187791 DEBUG nova.compute.provider_tree [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:19:17 compute-0 nova_compute[187787]: 2025-12-08 20:19:17.669 187791 DEBUG nova.scheduler.client.report [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:19:17 compute-0 nova_compute[187787]: 2025-12-08 20:19:17.702 187791 DEBUG oslo_concurrency.lockutils [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:17 compute-0 nova_compute[187787]: 2025-12-08 20:19:17.747 187791 INFO nova.scheduler.client.report [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Deleted allocations for instance 256c2381-1a08-4481-b6bd-63b36b87d4d9
Dec 08 20:19:17 compute-0 nova_compute[187787]: 2025-12-08 20:19:17.825 187791 DEBUG oslo_concurrency.lockutils [None req-b7690efb-f2c9-4238-b7fa-84de531e3eac 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:18 compute-0 nova_compute[187787]: 2025-12-08 20:19:18.820 187791 DEBUG nova.compute.manager [req-32308d2b-bb19-4461-8109-7bafbd424377 req-5b3ed3e3-b8c8-458e-914e-b479f5ed5de2 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received event network-vif-plugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:19:18 compute-0 nova_compute[187787]: 2025-12-08 20:19:18.821 187791 DEBUG oslo_concurrency.lockutils [req-32308d2b-bb19-4461-8109-7bafbd424377 req-5b3ed3e3-b8c8-458e-914e-b479f5ed5de2 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:18 compute-0 nova_compute[187787]: 2025-12-08 20:19:18.821 187791 DEBUG oslo_concurrency.lockutils [req-32308d2b-bb19-4461-8109-7bafbd424377 req-5b3ed3e3-b8c8-458e-914e-b479f5ed5de2 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:18 compute-0 nova_compute[187787]: 2025-12-08 20:19:18.822 187791 DEBUG oslo_concurrency.lockutils [req-32308d2b-bb19-4461-8109-7bafbd424377 req-5b3ed3e3-b8c8-458e-914e-b479f5ed5de2 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "256c2381-1a08-4481-b6bd-63b36b87d4d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:18 compute-0 nova_compute[187787]: 2025-12-08 20:19:18.822 187791 DEBUG nova.compute.manager [req-32308d2b-bb19-4461-8109-7bafbd424377 req-5b3ed3e3-b8c8-458e-914e-b479f5ed5de2 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] No waiting events found dispatching network-vif-plugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:19:18 compute-0 nova_compute[187787]: 2025-12-08 20:19:18.823 187791 WARNING nova.compute.manager [req-32308d2b-bb19-4461-8109-7bafbd424377 req-5b3ed3e3-b8c8-458e-914e-b479f5ed5de2 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received unexpected event network-vif-plugged-beabb6a9-e00a-4030-9cc5-7dd568f33bde for instance with vm_state deleted and task_state None.
Dec 08 20:19:18 compute-0 nova_compute[187787]: 2025-12-08 20:19:18.823 187791 DEBUG nova.compute.manager [req-32308d2b-bb19-4461-8109-7bafbd424377 req-5b3ed3e3-b8c8-458e-914e-b479f5ed5de2 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Received event network-vif-deleted-beabb6a9-e00a-4030-9cc5-7dd568f33bde external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.100 187791 DEBUG oslo_concurrency.lockutils [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.100 187791 DEBUG oslo_concurrency.lockutils [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.101 187791 DEBUG oslo_concurrency.lockutils [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.101 187791 DEBUG oslo_concurrency.lockutils [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.101 187791 DEBUG oslo_concurrency.lockutils [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.103 187791 INFO nova.compute.manager [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Terminating instance
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.104 187791 DEBUG nova.compute.manager [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 08 20:19:19 compute-0 kernel: tapea23d441-15 (unregistering): left promiscuous mode
Dec 08 20:19:19 compute-0 NetworkManager[56229]: <info>  [1765225159.1356] device (tapea23d441-15): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.143 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:19 compute-0 ovn_controller[96170]: 2025-12-08T20:19:19Z|00115|binding|INFO|Releasing lport ea23d441-1529-4558-b8e2-b0240af97aef from this chassis (sb_readonly=0)
Dec 08 20:19:19 compute-0 ovn_controller[96170]: 2025-12-08T20:19:19Z|00116|binding|INFO|Setting lport ea23d441-1529-4558-b8e2-b0240af97aef down in Southbound
Dec 08 20:19:19 compute-0 ovn_controller[96170]: 2025-12-08T20:19:19Z|00117|binding|INFO|Removing iface tapea23d441-15 ovn-installed in OVS
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.146 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.153 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:9e:29 10.100.0.13'], port_security=['fa:16:3e:e8:9e:29 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '65bbe4d2-0789-4405-9b37-2d5bd7b5f5af', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89a3e5bc-6928-489f-879e-9016cdae8e36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1eb0a13dd3f749b583ab1cf652d42ead', 'neutron:revision_number': '4', 'neutron:security_group_ids': '81ee238d-55fc-4374-aa3f-73da44b1a064', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11170bf2-0a3b-4c62-b0d3-3f66ab038b6b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>], logical_port=ea23d441-1529-4558-b8e2-b0240af97aef) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f4da7d88ee0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.155 105024 INFO neutron.agent.ovn.metadata.agent [-] Port ea23d441-1529-4558-b8e2-b0240af97aef in datapath 89a3e5bc-6928-489f-879e-9016cdae8e36 unbound from our chassis
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.157 105024 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89a3e5bc-6928-489f-879e-9016cdae8e36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.159 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[40e45087-f6b6-4695-a93e-83a4c69a0682]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.160 105024 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36 namespace which is not needed anymore
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.176 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:19 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 08 20:19:19 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000007.scope: Consumed 14.413s CPU time.
Dec 08 20:19:19 compute-0 systemd-machined[154122]: Machine qemu-8-instance-00000007 terminated.
Dec 08 20:19:19 compute-0 NetworkManager[56229]: <info>  [1765225159.3243] manager: (tapea23d441-15): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Dec 08 20:19:19 compute-0 neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36[217490]: [NOTICE]   (217494) : haproxy version is 2.8.14-c23fe91
Dec 08 20:19:19 compute-0 neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36[217490]: [NOTICE]   (217494) : path to executable is /usr/sbin/haproxy
Dec 08 20:19:19 compute-0 neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36[217490]: [WARNING]  (217494) : Exiting Master process...
Dec 08 20:19:19 compute-0 neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36[217490]: [ALERT]    (217494) : Current worker (217496) exited with code 143 (Terminated)
Dec 08 20:19:19 compute-0 neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36[217490]: [WARNING]  (217494) : All workers exited. Exiting... (0)
Dec 08 20:19:19 compute-0 systemd[1]: libpod-18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe.scope: Deactivated successfully.
Dec 08 20:19:19 compute-0 podman[217892]: 2025-12-08 20:19:19.352214442 +0000 UTC m=+0.070779865 container died 18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.378 187791 INFO nova.virt.libvirt.driver [-] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Instance destroyed successfully.
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.379 187791 DEBUG nova.objects.instance [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lazy-loading 'resources' on Instance uuid 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 08 20:19:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe-userdata-shm.mount: Deactivated successfully.
Dec 08 20:19:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-58bd87a7b3b7a03bbaa30f57c41cbdb29d59bdfe9895969c682f75d17567550b-merged.mount: Deactivated successfully.
Dec 08 20:19:19 compute-0 podman[217892]: 2025-12-08 20:19:19.391492866 +0000 UTC m=+0.110058289 container cleanup 18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.396 187791 DEBUG nova.virt.libvirt.vif [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-08T20:18:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-260707184',display_name='tempest-TestNetworkBasicOps-server-260707184',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-260707184',id=7,image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLd2cSTEUvFxgsV2ieVQZVMjc1Ieym1ixwtjsuBGMy7n7mrLOVqiBR013Z9hdxuXV9gHdwd006mZ3wW76WoWGANP40iQpeRarJ9oX8affq75YCUFwMGk1QpFwTF1rnEDIw==',key_name='tempest-TestNetworkBasicOps-2036254642',keypairs=<?>,launch_index=0,launched_at=2025-12-08T20:18:23Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1eb0a13dd3f749b583ab1cf652d42ead',ramdisk_id='',reservation_id='r-le16w1u1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ffae60d8-1843-4b3a-9d11-b077095cedb9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-827289320',owner_user_name='tempest-TestNetworkBasicOps-827289320-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-08T20:18:23Z,user_data=None,user_id='87606be528da4d588b06cc2635781b15',uuid=65bbe4d2-0789-4405-9b37-2d5bd7b5f5af,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": 
null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.397 187791 DEBUG nova.network.os_vif_util [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converting VIF {"id": "ea23d441-1529-4558-b8e2-b0240af97aef", "address": "fa:16:3e:e8:9e:29", "network": {"id": "89a3e5bc-6928-489f-879e-9016cdae8e36", "bridge": "br-int", "label": "tempest-network-smoke--1765383976", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1eb0a13dd3f749b583ab1cf652d42ead", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapea23d441-15", "ovs_interfaceid": "ea23d441-1529-4558-b8e2-b0240af97aef", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.398 187791 DEBUG nova.network.os_vif_util [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e8:9e:29,bridge_name='br-int',has_traffic_filtering=True,id=ea23d441-1529-4558-b8e2-b0240af97aef,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea23d441-15') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.398 187791 DEBUG os_vif [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:9e:29,bridge_name='br-int',has_traffic_filtering=True,id=ea23d441-1529-4558-b8e2-b0240af97aef,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea23d441-15') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 08 20:19:19 compute-0 systemd[1]: libpod-conmon-18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe.scope: Deactivated successfully.
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.400 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.400 187791 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapea23d441-15, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.402 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.404 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.407 187791 INFO os_vif [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e8:9e:29,bridge_name='br-int',has_traffic_filtering=True,id=ea23d441-1529-4558-b8e2-b0240af97aef,network=Network(89a3e5bc-6928-489f-879e-9016cdae8e36),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapea23d441-15')
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.408 187791 INFO nova.virt.libvirt.driver [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Deleting instance files /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af_del
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.409 187791 INFO nova.virt.libvirt.driver [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Deletion of /var/lib/nova/instances/65bbe4d2-0789-4405-9b37-2d5bd7b5f5af_del complete
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.466 187791 INFO nova.compute.manager [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.467 187791 DEBUG oslo.service.loopingcall [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
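The loopingcall wait above wraps Nova's network-deallocation retries in an oslo.service looping call that keeps running until the wrapped function raises LoopingCallDone. A self-contained sketch of that pattern; the retry body, counter, and one-second interval are placeholders rather than Nova's actual values:

    from oslo_service import loopingcall

    attempts = {'count': 0}

    def _deallocate_with_retries():
        # Placeholder retry body: succeed on the third pass.
        attempts['count'] += 1
        if attempts['count'] >= 3:
            raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
    result = timer.start(interval=1, initial_delay=None).wait()
    print(result)  # True, returned via LoopingCallDone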
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.467 187791 DEBUG nova.compute.manager [-] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.468 187791 DEBUG nova.network.neutron [-] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
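deallocate_for_instance() asks Neutron to remove the ports Nova created for the instance (the VIF above has preserve_on_delete=False, so its port is deleted rather than kept). Nova drives this through its own Neutron client; an equivalent external-client view with openstacksdk, where the cloud name is an assumption:

    import openstack

    conn = openstack.connect(cloud='openstack')  # cloud entry is a placeholder

    instance_uuid = '65bbe4d2-0789-4405-9b37-2d5bd7b5f5af'
    # Remove every port still bound to the deleted instance.
    for port in conn.network.ports(device_id=instance_uuid):
        conn.network.delete_port(port, ignore_missing=True)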
Dec 08 20:19:19 compute-0 podman[217938]: 2025-12-08 20:19:19.470035683 +0000 UTC m=+0.052229152 container remove 18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.476 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[11157d9d-9aa2-4b6c-a505-6dfca03ea81f]: (4, ('Mon Dec  8 08:19:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36 (18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe)\n18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe\nMon Dec  8 08:19:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36 (18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe)\n18c49739c2f5207c71891ae76d62ead82dbc7d1659bb5bf00fc52841ddfcc4fe\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
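The privsep reply above captures the agent's wrapper script stopping and then deleting the per-network haproxy metadata container. Reduced to its essence, the teardown is two podman calls; a plain subprocess sketch (the real wrapper adds timestamped logging around each step):

    import subprocess

    name = 'neutron-haproxy-ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36'
    subprocess.run(['podman', 'stop', name], check=True)
    subprocess.run(['podman', 'rm', name], check=True)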
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.478 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[19793496-b61c-46ae-b78e-056dd75df373]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.479 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap89a3e5bc-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.481 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:19 compute-0 kernel: tap89a3e5bc-60: left promiscuous mode
Dec 08 20:19:19 compute-0 nova_compute[187787]: 2025-12-08 20:19:19.494 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.498 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[b79fe57c-fe2e-47eb-9338-01798655d612]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.517 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[5b00389a-3782-4cf2-a854-4edf0aa4691c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.519 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[7bbb5446-2e9d-4f60-aaca-69530288e2e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.544 214668 DEBUG oslo.privsep.daemon [-] privsep: reply[eff27416-987a-4c05-8b94-3d602597db3f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364093, 'reachable_time': 20911, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 
'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217954, 'error': None, 'target': 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
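The oversized privsep reply ending above is a netlink RTM_NEWLINK dump of the loopback interface inside the ovnmeta namespace, fetched just before the namespace is torn down. The same kind of dump can be produced with pyroute2; a sketch that assumes the namespace still exists when it runs:

    from pyroute2 import NetNS

    with NetNS('ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36') as ns:
        for link in ns.get_links():
            print(link.get_attr('IFLA_IFNAME'), link['state'])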
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.548 105136 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 08 20:19:19 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:19.548 105136 DEBUG oslo.privsep.daemon [-] privsep: reply[e38bbadb-d692-45aa-ac7a-9ef05378e283]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
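remove_netns in neutron's privileged ip_lib helper deletes the namespace via pyroute2, which unmounts and unlinks /run/netns/<name>; systemd then reports the corresponding mount unit as deactivated on the next line. A minimal sketch of the same call, assuming sufficient privileges:

    from pyroute2 import netns

    ns_name = 'ovnmeta-89a3e5bc-6928-489f-879e-9016cdae8e36'
    if ns_name in netns.listnetns():
        netns.remove(ns_name)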
Dec 08 20:19:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d89a3e5bc\x2d6928\x2d489f\x2d879e\x2d9016cdae8e36.mount: Deactivated successfully.
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.836 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is larger than the number of worker threads available to execute them; the polling cycle can therefore be expected to take longer than planned. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.837 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
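The two manager lines above note that the [pollsters] source defines more pollsters than worker threads, so all of them are queued onto a single-thread executor and processed sequentially. A generic sketch of that pattern; the pollster names are taken from the log, while the poll function body is a placeholder:

    from concurrent.futures import ThreadPoolExecutor

    pollsters = ['disk.device.read.latency', 'disk.device.usage', 'power.state']

    def poll(name):
        # Real pollsters run instance discovery and sample collection here.
        return f'{name}: no resources found this cycle'

    # One worker, as in the log line above: tasks queue behind each other.
    with ThreadPoolExecutor(max_workers=1) as executor:
        for result in executor.map(poll, pollsters):
            print(result)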
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.838 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.851 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.852 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.852 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.853 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.853 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.853 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.854 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.854 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.854 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.855 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.857 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.857 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.857 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.857 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.857 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.857 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.858 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.858 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.858 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:19:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:19:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
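The burst of "Finished processing pollster [...]" DEBUG lines above is a single polling cycle of the ceilometer compute agent; each bracketed name is one meter it just collected. A minimal, hypothetical Python sketch (not part of ceilometer) that extracts the meter names from journal lines formatted like the ones above, reading the log text from stdin:

    import re
    import sys

    # Matches the meter name inside "Finished processing pollster [<name>]"
    # in ceilometer.polling.manager DEBUG lines, as they appear in this journal.
    POLLSTER_RE = re.compile(r"Finished processing pollster \[([^\]]+)\]")

    meters = set()
    for line in sys.stdin:
        match = POLLSTER_RE.search(line)
        if match:
            meters.add(match.group(1))

    for name in sorted(meters):
        print(name)

Fed the excerpt above, it would print each meter once: cpu, the disk.device.* meters, memory.usage, and the network.* meters.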
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.199 187791 DEBUG nova.network.neutron [-] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.218 187791 INFO nova.compute.manager [-] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Took 0.75 seconds to deallocate network for instance.
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.274 187791 DEBUG oslo_concurrency.lockutils [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.275 187791 DEBUG oslo_concurrency.lockutils [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.379 187791 DEBUG nova.compute.provider_tree [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.399 187791 DEBUG nova.scheduler.client.report [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.441 187791 DEBUG oslo_concurrency.lockutils [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.483 187791 INFO nova.scheduler.client.report [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Deleted allocations for instance 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.589 187791 DEBUG oslo_concurrency.lockutils [None req-a79be167-e7ef-4ed9-b02f-87ec58037255 87606be528da4d588b06cc2635781b15 1eb0a13dd3f749b583ab1cf652d42ead - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.942 187791 DEBUG nova.compute.manager [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received event network-vif-unplugged-ea23d441-1529-4558-b8e2-b0240af97aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.943 187791 DEBUG oslo_concurrency.lockutils [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.944 187791 DEBUG oslo_concurrency.lockutils [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.944 187791 DEBUG oslo_concurrency.lockutils [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.945 187791 DEBUG nova.compute.manager [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] No waiting events found dispatching network-vif-unplugged-ea23d441-1529-4558-b8e2-b0240af97aef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.946 187791 WARNING nova.compute.manager [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received unexpected event network-vif-unplugged-ea23d441-1529-4558-b8e2-b0240af97aef for instance with vm_state deleted and task_state None.
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.946 187791 DEBUG nova.compute.manager [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received event network-vif-plugged-ea23d441-1529-4558-b8e2-b0240af97aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.947 187791 DEBUG oslo_concurrency.lockutils [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Acquiring lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.948 187791 DEBUG oslo_concurrency.lockutils [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.948 187791 DEBUG oslo_concurrency.lockutils [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] Lock "65bbe4d2-0789-4405-9b37-2d5bd7b5f5af-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.949 187791 DEBUG nova.compute.manager [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] No waiting events found dispatching network-vif-plugged-ea23d441-1529-4558-b8e2-b0240af97aef pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.950 187791 WARNING nova.compute.manager [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received unexpected event network-vif-plugged-ea23d441-1529-4558-b8e2-b0240af97aef for instance with vm_state deleted and task_state None.
Dec 08 20:19:20 compute-0 nova_compute[187787]: 2025-12-08 20:19:20.951 187791 DEBUG nova.compute.manager [req-b6ca0d56-1cd5-45fd-b7b5-6a4f2e848a17 req-fab87745-6fd2-4ce0-99ef-a803f93d22c3 073be92b248c4c32b0574828246bd097 04622427e32547659bffcd2f62034120 - - default default] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Received event network-vif-deleted-ea23d441-1529-4558-b8e2-b0240af97aef external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 08 20:19:21 compute-0 nova_compute[187787]: 2025-12-08 20:19:21.187 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:21 compute-0 nova_compute[187787]: 2025-12-08 20:19:21.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:19:21 compute-0 nova_compute[187787]: 2025-12-08 20:19:21.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:19:21 compute-0 nova_compute[187787]: 2025-12-08 20:19:21.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:19:22 compute-0 podman[217957]: 2025-12-08 20:19:22.563504522 +0000 UTC m=+0.117675107 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 08 20:19:22 compute-0 podman[217956]: 2025-12-08 20:19:22.596101566 +0000 UTC m=+0.150267221 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 08 20:19:23 compute-0 nova_compute[187787]: 2025-12-08 20:19:23.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:19:23 compute-0 nova_compute[187787]: 2025-12-08 20:19:23.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:19:23 compute-0 nova_compute[187787]: 2025-12-08 20:19:23.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:19:23 compute-0 nova_compute[187787]: 2025-12-08 20:19:23.797 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:19:24 compute-0 nova_compute[187787]: 2025-12-08 20:19:24.076 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:24 compute-0 nova_compute[187787]: 2025-12-08 20:19:24.144 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:24 compute-0 nova_compute[187787]: 2025-12-08 20:19:24.403 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:24 compute-0 podman[218003]: 2025-12-08 20:19:24.521487035 +0000 UTC m=+0.080600462 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:19:24 compute-0 nova_compute[187787]: 2025-12-08 20:19:24.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:19:24 compute-0 nova_compute[187787]: 2025-12-08 20:19:24.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:19:25 compute-0 nova_compute[187787]: 2025-12-08 20:19:25.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:19:25 compute-0 nova_compute[187787]: 2025-12-08 20:19:25.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:19:25 compute-0 nova_compute[187787]: 2025-12-08 20:19:25.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:19:25 compute-0 nova_compute[187787]: 2025-12-08 20:19:25.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:19:25 compute-0 nova_compute[187787]: 2025-12-08 20:19:25.809 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:25 compute-0 nova_compute[187787]: 2025-12-08 20:19:25.810 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:25 compute-0 nova_compute[187787]: 2025-12-08 20:19:25.810 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
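The Acquiring / acquired / released triples logged by oslo_concurrency.lockutils come from the "inner" wrapper that the synchronized decorator places around methods such as ResourceTracker.update_usage and clean_compute_node_cache. A rough, illustrative sketch of that pattern using the oslo.concurrency API (not Nova's actual code):

    from oslo_concurrency import lockutils

    # The decorator's wrapper emits the "Acquiring"/"acquired" DEBUG lines on
    # entry and the 'released ... held N.NNNs' line on exit, with the waited
    # and held timings seen in the journal above.
    @lockutils.synchronized("compute_resources")
    def update_usage():
        pass  # critical section: resource tracker bookkeeping

    # The lower-level context-manager form of the same named lock:
    with lockutils.lock("compute_resources"):
        pass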
Dec 08 20:19:25 compute-0 nova_compute[187787]: 2025-12-08 20:19:25.810 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.039 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.041 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5676MB free_disk=72.87684631347656GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.042 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.042 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.117 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.118 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.147 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.166 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
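The inventory dict logged above (total, reserved, min_unit, max_unit, step_size, allocation_ratio per resource class) is what the resource tracker reports to Placement for provider b3899b98-89be-4b90-bd85-9c57a93a16c4. As a rough illustration using the conventional Placement capacity formula, capacity = (total - reserved) * allocation_ratio (a sketch, not Nova code), the schedulable capacity implied by these numbers is:

    # Inventory values exactly as logged above.
    inventory = {
        "MEMORY_MB": {"total": 7680, "reserved": 512, "allocation_ratio": 1.0},
        "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
        "DISK_GB": {"total": 79, "reserved": 1, "allocation_ratio": 0.9},
    }

    for rc, inv in inventory.items():
        capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
        print(f"{rc}: {capacity}")
    # MEMORY_MB: 7168.0, VCPU: 32.0, DISK_GB: 70.2 (rounded)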
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.192 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.192 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.150s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:26 compute-0 nova_compute[187787]: 2025-12-08 20:19:26.228 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:29 compute-0 nova_compute[187787]: 2025-12-08 20:19:29.405 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:29 compute-0 podman[202017]: time="2025-12-08T20:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:19:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:19:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3471 "" "Go-http-client/1.1"
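The two podman[202017] access-log lines are the podman_exporter scraping the libpod REST API over the unix socket configured for it (CONTAINER_HOST=unix:///run/podman/podman.sock in the podman_exporter config_data logged earlier). A bare-bones, illustrative way to issue the same containers/json request from Python using only the standard library, run as root on the host (this is not how the exporter itself is implemented):

    import socket

    SOCKET_PATH = "/run/podman/podman.sock"
    # Same endpoint the exporter requested in the access-log lines above.
    REQUEST = (
        "GET /v4.9.3/libpod/containers/json?all=true HTTP/1.1\r\n"
        "Host: d\r\n"
        "Connection: close\r\n"
        "\r\n"
    )

    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    sock.connect(SOCKET_PATH)
    sock.sendall(REQUEST.encode("ascii"))

    response = b""
    while chunk := sock.recv(4096):
        response += chunk
    sock.close()

    # Raw HTTP response: status line, headers, then the JSON container list.
    print(response.decode("utf-8", errors="replace")[:500])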
Dec 08 20:19:31 compute-0 nova_compute[187787]: 2025-12-08 20:19:31.230 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:31 compute-0 nova_compute[187787]: 2025-12-08 20:19:31.410 187791 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765225156.409126, 256c2381-1a08-4481-b6bd-63b36b87d4d9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:19:31 compute-0 nova_compute[187787]: 2025-12-08 20:19:31.411 187791 INFO nova.compute.manager [-] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] VM Stopped (Lifecycle Event)
Dec 08 20:19:31 compute-0 openstack_network_exporter[204149]: ERROR   20:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:19:31 compute-0 openstack_network_exporter[204149]: ERROR   20:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:19:31 compute-0 openstack_network_exporter[204149]: ERROR   20:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:19:31 compute-0 openstack_network_exporter[204149]: ERROR   20:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:19:31 compute-0 openstack_network_exporter[204149]: ERROR   20:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:19:31 compute-0 nova_compute[187787]: 2025-12-08 20:19:31.672 187791 DEBUG nova.compute.manager [None req-3df52085-2f5e-4d4e-bf1f-532891c8d05a - - - - - -] [instance: 256c2381-1a08-4481-b6bd-63b36b87d4d9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:19:32 compute-0 sshd-session[218029]: Received disconnect from 45.174.162.68 port 24741:11: Bye Bye [preauth]
Dec 08 20:19:32 compute-0 sshd-session[218029]: Disconnected from authenticating user root 45.174.162.68 port 24741 [preauth]
Dec 08 20:19:34 compute-0 nova_compute[187787]: 2025-12-08 20:19:34.375 187791 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765225159.374065, 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 08 20:19:34 compute-0 nova_compute[187787]: 2025-12-08 20:19:34.376 187791 INFO nova.compute.manager [-] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] VM Stopped (Lifecycle Event)
Dec 08 20:19:34 compute-0 nova_compute[187787]: 2025-12-08 20:19:34.406 187791 DEBUG nova.compute.manager [None req-1fa20960-0d83-4248-abef-1b2c31485e9c - - - - - -] [instance: 65bbe4d2-0789-4405-9b37-2d5bd7b5f5af] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 08 20:19:34 compute-0 nova_compute[187787]: 2025-12-08 20:19:34.407 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:36 compute-0 nova_compute[187787]: 2025-12-08 20:19:36.234 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:36 compute-0 podman[218031]: 2025-12-08 20:19:36.555045126 +0000 UTC m=+0.104741202 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Dec 08 20:19:39 compute-0 nova_compute[187787]: 2025-12-08 20:19:39.409 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:41 compute-0 nova_compute[187787]: 2025-12-08 20:19:41.237 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:42 compute-0 podman[218051]: 2025-12-08 20:19:42.529961226 +0000 UTC m=+0.093965603 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 08 20:19:44 compute-0 nova_compute[187787]: 2025-12-08 20:19:44.411 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:44 compute-0 podman[218073]: 2025-12-08 20:19:44.753069388 +0000 UTC m=+0.068136912 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 08 20:19:44 compute-0 podman[218072]: 2025-12-08 20:19:44.75345956 +0000 UTC m=+0.078480036 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:19:46 compute-0 nova_compute[187787]: 2025-12-08 20:19:46.240 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:47 compute-0 sshd-session[218115]: Received disconnect from 47.76.127.165 port 57032:11: Bye Bye [preauth]
Dec 08 20:19:47 compute-0 sshd-session[218115]: Disconnected from authenticating user root 47.76.127.165 port 57032 [preauth]
Dec 08 20:19:49 compute-0 nova_compute[187787]: 2025-12-08 20:19:49.413 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:51 compute-0 nova_compute[187787]: 2025-12-08 20:19:51.278 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:53 compute-0 podman[218119]: 2025-12-08 20:19:53.524378229 +0000 UTC m=+0.078107284 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251202, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 08 20:19:53 compute-0 podman[218118]: 2025-12-08 20:19:53.547179585 +0000 UTC m=+0.113911269 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:19:54 compute-0 nova_compute[187787]: 2025-12-08 20:19:54.414 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:54 compute-0 ovn_controller[96170]: 2025-12-08T20:19:54Z|00118|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Dec 08 20:19:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:54.994 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:19:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:54.995 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:19:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:19:54.995 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:19:55 compute-0 podman[218164]: 2025-12-08 20:19:55.517143915 +0000 UTC m=+0.082748061 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:19:56 compute-0 nova_compute[187787]: 2025-12-08 20:19:56.320 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:59 compute-0 nova_compute[187787]: 2025-12-08 20:19:59.418 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:19:59 compute-0 podman[202017]: time="2025-12-08T20:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:19:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:19:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3459 "" "Go-http-client/1.1"
Dec 08 20:20:01 compute-0 nova_compute[187787]: 2025-12-08 20:20:01.322 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:01 compute-0 openstack_network_exporter[204149]: ERROR   20:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:20:01 compute-0 openstack_network_exporter[204149]: ERROR   20:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:20:01 compute-0 openstack_network_exporter[204149]: ERROR   20:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:20:01 compute-0 openstack_network_exporter[204149]: ERROR   20:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:20:01 compute-0 openstack_network_exporter[204149]: ERROR   20:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:20:04 compute-0 nova_compute[187787]: 2025-12-08 20:20:04.420 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:06 compute-0 nova_compute[187787]: 2025-12-08 20:20:06.325 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:07 compute-0 podman[218191]: 2025-12-08 20:20:07.496759572 +0000 UTC m=+0.070038371 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125)
Dec 08 20:20:09 compute-0 nova_compute[187787]: 2025-12-08 20:20:09.422 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:10 compute-0 sshd-session[218189]: ssh_dispatch_run_fatal: Connection from 222.172.32.246 port 2182: Connection timed out [preauth]
Dec 08 20:20:11 compute-0 nova_compute[187787]: 2025-12-08 20:20:11.328 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:13 compute-0 podman[218211]: 2025-12-08 20:20:13.530790279 +0000 UTC m=+0.100833059 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, release=1755695350, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 08 20:20:14 compute-0 nova_compute[187787]: 2025-12-08 20:20:14.424 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:14 compute-0 sshd-session[218232]: Invalid user httpd from 200.155.38.219 port 32474
Dec 08 20:20:14 compute-0 sshd-session[218232]: Received disconnect from 200.155.38.219 port 32474:11: Bye Bye [preauth]
Dec 08 20:20:14 compute-0 sshd-session[218232]: Disconnected from invalid user httpd 200.155.38.219 port 32474 [preauth]
Dec 08 20:20:15 compute-0 podman[218235]: 2025-12-08 20:20:15.525211777 +0000 UTC m=+0.074757989 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 08 20:20:15 compute-0 podman[218234]: 2025-12-08 20:20:15.532948169 +0000 UTC m=+0.098351630 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:20:16 compute-0 nova_compute[187787]: 2025-12-08 20:20:16.369 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:19 compute-0 sshd-session[218274]: Accepted publickey for zuul from 192.168.122.10 port 56034 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 20:20:19 compute-0 systemd-logind[793]: New session 26 of user zuul.
Dec 08 20:20:19 compute-0 systemd[1]: Started Session 26 of User zuul.
Dec 08 20:20:19 compute-0 sshd-session[218274]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 20:20:19 compute-0 sudo[218278]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 08 20:20:19 compute-0 sudo[218278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:20:19 compute-0 nova_compute[187787]: 2025-12-08 20:20:19.426 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:21 compute-0 nova_compute[187787]: 2025-12-08 20:20:21.412 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:22 compute-0 nova_compute[187787]: 2025-12-08 20:20:22.189 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:23 compute-0 nova_compute[187787]: 2025-12-08 20:20:23.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:23 compute-0 nova_compute[187787]: 2025-12-08 20:20:23.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:20:24 compute-0 ovs-vsctl[218448]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 08 20:20:24 compute-0 nova_compute[187787]: 2025-12-08 20:20:24.428 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:24 compute-0 podman[218450]: 2025-12-08 20:20:24.506566183 +0000 UTC m=+0.069884786 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 08 20:20:24 compute-0 podman[218449]: 2025-12-08 20:20:24.54212591 +0000 UTC m=+0.104553206 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 08 20:20:24 compute-0 nova_compute[187787]: 2025-12-08 20:20:24.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:24 compute-0 nova_compute[187787]: 2025-12-08 20:20:24.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:20:24 compute-0 nova_compute[187787]: 2025-12-08 20:20:24.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:20:24 compute-0 nova_compute[187787]: 2025-12-08 20:20:24.911 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:20:24 compute-0 nova_compute[187787]: 2025-12-08 20:20:24.911 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:25 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 218302 (sos)
Dec 08 20:20:25 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 08 20:20:25 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 08 20:20:25 compute-0 virtqemud[187722]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 08 20:20:25 compute-0 virtqemud[187722]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 08 20:20:25 compute-0 virtqemud[187722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 08 20:20:25 compute-0 nova_compute[187787]: 2025-12-08 20:20:25.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:26 compute-0 podman[218707]: 2025-12-08 20:20:26.05908084 +0000 UTC m=+0.085627511 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.411 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:26 compute-0 crontab[218938]: (root) LIST (root)
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.823 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.823 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.824 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.824 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.969 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.971 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5471MB free_disk=72.85136413574219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.971 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:20:26 compute-0 nova_compute[187787]: 2025-12-08 20:20:26.971 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.063 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.064 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.110 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing inventories for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.139 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Updating ProviderTree inventory for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.140 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.166 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing aggregate associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.202 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing trait associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,HW_CPU_X86_FMA3,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE4A,HW_CPU_X86_ABM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.242 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.264 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.267 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:20:27 compute-0 nova_compute[187787]: 2025-12-08 20:20:27.268 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.296s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:20:28 compute-0 nova_compute[187787]: 2025-12-08 20:20:28.268 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:28 compute-0 nova_compute[187787]: 2025-12-08 20:20:28.269 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:29 compute-0 systemd[1]: Starting Hostname Service...
Dec 08 20:20:29 compute-0 systemd[1]: Started Hostname Service.
Dec 08 20:20:29 compute-0 nova_compute[187787]: 2025-12-08 20:20:29.462 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:29 compute-0 podman[202017]: time="2025-12-08T20:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:20:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:20:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3469 "" "Go-http-client/1.1"
Dec 08 20:20:29 compute-0 nova_compute[187787]: 2025-12-08 20:20:29.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:20:31 compute-0 nova_compute[187787]: 2025-12-08 20:20:31.421 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:31 compute-0 openstack_network_exporter[204149]: ERROR   20:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:20:31 compute-0 openstack_network_exporter[204149]: ERROR   20:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:20:31 compute-0 openstack_network_exporter[204149]: ERROR   20:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:20:31 compute-0 openstack_network_exporter[204149]: ERROR   20:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:20:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:20:31 compute-0 openstack_network_exporter[204149]: ERROR   20:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:20:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:20:34 compute-0 nova_compute[187787]: 2025-12-08 20:20:34.520 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck2507952480-merged.mount: Deactivated successfully.
Dec 08 20:20:36 compute-0 nova_compute[187787]: 2025-12-08 20:20:36.468 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:36 compute-0 ovs-appctl[220234]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 08 20:20:36 compute-0 ovs-appctl[220238]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 08 20:20:36 compute-0 ovs-appctl[220245]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 08 20:20:38 compute-0 podman[220763]: 2025-12-08 20:20:38.521254225 +0000 UTC m=+0.084644310 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.build-date=20251125)
Dec 08 20:20:39 compute-0 nova_compute[187787]: 2025-12-08 20:20:39.524 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:41 compute-0 nova_compute[187787]: 2025-12-08 20:20:41.470 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:42 compute-0 sshd-session[217666]: Connection reset by 101.47.160.247 port 51964 [preauth]
Dec 08 20:20:43 compute-0 podman[221334]: 2025-12-08 20:20:43.776098547 +0000 UTC m=+0.070962650 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, release=1755695350, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public)
Dec 08 20:20:44 compute-0 virtqemud[187722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 08 20:20:44 compute-0 nova_compute[187787]: 2025-12-08 20:20:44.525 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:45 compute-0 podman[221611]: 2025-12-08 20:20:45.643996021 +0000 UTC m=+0.070086963 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:20:45 compute-0 podman[221615]: 2025-12-08 20:20:45.650154473 +0000 UTC m=+0.065709364 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:20:46 compute-0 systemd[1]: Starting Time & Date Service...
Dec 08 20:20:46 compute-0 systemd[1]: Started Time & Date Service.
Dec 08 20:20:46 compute-0 nova_compute[187787]: 2025-12-08 20:20:46.505 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:48 compute-0 sshd-session[221724]: Invalid user webserver from 45.174.162.68 port 62777
Dec 08 20:20:48 compute-0 sshd-session[221724]: Received disconnect from 45.174.162.68 port 62777:11: Bye Bye [preauth]
Dec 08 20:20:48 compute-0 sshd-session[221724]: Disconnected from invalid user webserver 45.174.162.68 port 62777 [preauth]
Dec 08 20:20:49 compute-0 nova_compute[187787]: 2025-12-08 20:20:49.527 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:51 compute-0 nova_compute[187787]: 2025-12-08 20:20:51.545 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:54 compute-0 nova_compute[187787]: 2025-12-08 20:20:54.529 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:20:54.996 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:20:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:20:54.998 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:20:54 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:20:54.998 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:20:55 compute-0 podman[221732]: 2025-12-08 20:20:55.301593968 +0000 UTC m=+0.072763396 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:20:55 compute-0 podman[221731]: 2025-12-08 20:20:55.343436242 +0000 UTC m=+0.118596696 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 08 20:20:55 compute-0 sshd-session[221729]: Received disconnect from 103.172.28.62 port 43828:11: Bye Bye [preauth]
Dec 08 20:20:55 compute-0 sshd-session[221729]: Disconnected from authenticating user root 103.172.28.62 port 43828 [preauth]
Dec 08 20:20:56 compute-0 podman[221772]: 2025-12-08 20:20:56.381921983 +0000 UTC m=+0.064532188 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:20:56 compute-0 nova_compute[187787]: 2025-12-08 20:20:56.590 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:59 compute-0 nova_compute[187787]: 2025-12-08 20:20:59.532 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:20:59 compute-0 podman[202017]: time="2025-12-08T20:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:20:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:20:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Dec 08 20:21:01 compute-0 openstack_network_exporter[204149]: ERROR   20:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:21:01 compute-0 openstack_network_exporter[204149]: ERROR   20:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:21:01 compute-0 openstack_network_exporter[204149]: ERROR   20:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:21:01 compute-0 openstack_network_exporter[204149]: ERROR   20:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:21:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:21:01 compute-0 openstack_network_exporter[204149]: ERROR   20:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:21:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:21:01 compute-0 nova_compute[187787]: 2025-12-08 20:21:01.592 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:04 compute-0 nova_compute[187787]: 2025-12-08 20:21:04.534 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:06 compute-0 sudo[218278]: pam_unix(sudo:session): session closed for user root
Dec 08 20:21:06 compute-0 sshd-session[218277]: Received disconnect from 192.168.122.10 port 56034:11: disconnected by user
Dec 08 20:21:06 compute-0 sshd-session[218277]: Disconnected from user zuul 192.168.122.10 port 56034
Dec 08 20:21:06 compute-0 sshd-session[218274]: pam_unix(sshd:session): session closed for user zuul
Dec 08 20:21:06 compute-0 systemd[1]: session-26.scope: Deactivated successfully.
Dec 08 20:21:06 compute-0 systemd[1]: session-26.scope: Consumed 1min 18.766s CPU time, 584.7M memory peak, read 224.3M from disk, written 18.4M to disk.
Dec 08 20:21:06 compute-0 systemd-logind[793]: Session 26 logged out. Waiting for processes to exit.
Dec 08 20:21:06 compute-0 systemd-logind[793]: Removed session 26.
Dec 08 20:21:06 compute-0 sshd-session[221794]: Accepted publickey for zuul from 192.168.122.10 port 35072 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 20:21:06 compute-0 systemd-logind[793]: New session 27 of user zuul.
Dec 08 20:21:06 compute-0 nova_compute[187787]: 2025-12-08 20:21:06.598 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:06 compute-0 systemd[1]: Started Session 27 of User zuul.
Dec 08 20:21:06 compute-0 sshd-session[221794]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 20:21:06 compute-0 sudo[221800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-12-08-cwebjns.tar.xz
Dec 08 20:21:06 compute-0 sudo[221800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:21:06 compute-0 sudo[221800]: pam_unix(sudo:session): session closed for user root
Dec 08 20:21:06 compute-0 sshd-session[221799]: Received disconnect from 192.168.122.10 port 35072:11: disconnected by user
Dec 08 20:21:06 compute-0 sshd-session[221799]: Disconnected from user zuul 192.168.122.10 port 35072
Dec 08 20:21:06 compute-0 sshd-session[221794]: pam_unix(sshd:session): session closed for user zuul
Dec 08 20:21:06 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Dec 08 20:21:06 compute-0 systemd-logind[793]: Session 27 logged out. Waiting for processes to exit.
Dec 08 20:21:06 compute-0 systemd-logind[793]: Removed session 27.
Dec 08 20:21:07 compute-0 sshd-session[221825]: Accepted publickey for zuul from 192.168.122.10 port 35078 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 20:21:07 compute-0 systemd-logind[793]: New session 28 of user zuul.
Dec 08 20:21:07 compute-0 systemd[1]: Started Session 28 of User zuul.
Dec 08 20:21:07 compute-0 sshd-session[221825]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 20:21:07 compute-0 sudo[221829]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Dec 08 20:21:07 compute-0 sudo[221829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:21:07 compute-0 sudo[221829]: pam_unix(sudo:session): session closed for user root
Dec 08 20:21:07 compute-0 sshd-session[221828]: Received disconnect from 192.168.122.10 port 35078:11: disconnected by user
Dec 08 20:21:07 compute-0 sshd-session[221828]: Disconnected from user zuul 192.168.122.10 port 35078
Dec 08 20:21:07 compute-0 sshd-session[221825]: pam_unix(sshd:session): session closed for user zuul
Dec 08 20:21:07 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Dec 08 20:21:07 compute-0 systemd-logind[793]: Session 28 logged out. Waiting for processes to exit.
Dec 08 20:21:07 compute-0 systemd-logind[793]: Removed session 28.
Dec 08 20:21:09 compute-0 nova_compute[187787]: 2025-12-08 20:21:09.539 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:09 compute-0 podman[221854]: 2025-12-08 20:21:09.554273976 +0000 UTC m=+0.108756037 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Dec 08 20:21:11 compute-0 nova_compute[187787]: 2025-12-08 20:21:11.600 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:12 compute-0 sshd-session[221796]: Invalid user zabbix from 45.78.228.32 port 45494
Dec 08 20:21:13 compute-0 sshd-session[221796]: Received disconnect from 45.78.228.32 port 45494:11: Bye Bye [preauth]
Dec 08 20:21:13 compute-0 sshd-session[221796]: Disconnected from invalid user zabbix 45.78.228.32 port 45494 [preauth]
Dec 08 20:21:14 compute-0 podman[221874]: 2025-12-08 20:21:14.540510361 +0000 UTC m=+0.099862567 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, architecture=x86_64, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Dec 08 20:21:14 compute-0 nova_compute[187787]: 2025-12-08 20:21:14.542 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:16 compute-0 podman[221895]: 2025-12-08 20:21:16.516019365 +0000 UTC m=+0.079399505 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
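Editor's note: the node_exporter container above is started with '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service', so only matching systemd units are exported. A quick sketch to test which unit names that include pattern matches; the sample unit list is illustrative, not read from the host, and node_exporter anchors the pattern to the whole unit name (hence fullmatch).

# Hedged sketch: check which systemd unit names match the include pattern above.
import re

# Single backslash here; the doubled backslash in the log line is JSON/shell escaping.
INCLUDE = re.compile(r"(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service")

SAMPLE_UNITS = [             # assumed examples
    "edpm_nova_compute.service",
    "ovs-vswitchd.service",
    "openvswitch.service",
    "virtqemud.service",
    "rsyslog.service",
    "sshd.service",          # not exported: does not match the include pattern
]

for unit in SAMPLE_UNITS:
    print(f"{unit:35s} exported={bool(INCLUDE.fullmatch(unit))}")
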
Dec 08 20:21:16 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 08 20:21:16 compute-0 podman[221896]: 2025-12-08 20:21:16.526330589 +0000 UTC m=+0.085511097 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 08 20:21:16 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 08 20:21:16 compute-0 nova_compute[187787]: 2025-12-08 20:21:16.603 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:19 compute-0 sshd-session[221941]: Received disconnect from 47.76.127.165 port 56640:11: Bye Bye [preauth]
Dec 08 20:21:19 compute-0 sshd-session[221941]: Disconnected from authenticating user root 47.76.127.165 port 56640 [preauth]
Dec 08 20:21:19 compute-0 nova_compute[187787]: 2025-12-08 20:21:19.543 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.836 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to take longer than expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.837 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.839 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac2b7d0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.853 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.853 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.856 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.856 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:21:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:21:19.856 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
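Editor's note: the ceilometer polling cycle above (20:21:19.836 to 19.856) shows two things: the agent warns that the [pollsters] source has more pollsters than worker threads and then dispatches them all onto a single-thread executor, and every pollster is skipped because the local_instances discovery returns no instances on this host. A minimal sketch of that dispatch pattern, assuming a one-worker pool and an empty discovery result as in the log; the names here are illustrative, not ceilometer's internals.

# Hedged sketch: more tasks than workers on a ThreadPoolExecutor simply queue up,
# and an empty discovery result means each "pollster" is skipped for the cycle.
from concurrent.futures import ThreadPoolExecutor

POLLSTERS = ["disk.device.read.latency", "disk.device.usage", "power.state",
             "memory.usage", "cpu", "network.incoming.bytes"]

def discover_local_instances():
    return []            # matches the empty discovery cache {'local_instances': []} in the log

def run_pollster(name, discovery_cache):
    resources = discovery_cache.setdefault("local_instances", discover_local_instances())
    if not resources:
        return f"Skip pollster {name}, no resources found this cycle"
    return f"Polled {name} for {len(resources)} resources"

with ThreadPoolExecutor(max_workers=1) as pool:   # one worker, as in "Processing ... with [1] threads"
    cache = {}
    for result in pool.map(run_pollster, POLLSTERS, [cache] * len(POLLSTERS)):
        print(result)
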
Dec 08 20:21:21 compute-0 nova_compute[187787]: 2025-12-08 20:21:21.605 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:23 compute-0 nova_compute[187787]: 2025-12-08 20:21:23.955 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:21:24 compute-0 nova_compute[187787]: 2025-12-08 20:21:24.554 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:24 compute-0 nova_compute[187787]: 2025-12-08 20:21:24.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:21:24 compute-0 nova_compute[187787]: 2025-12-08 20:21:24.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
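Editor's note: the nova_compute entries above are oslo.service periodic tasks; _reclaim_queued_deletes returns early because reclaim_instance_interval is zero or negative in this deployment, so soft-deleted instances are not reclaimed by this compute. A rough sketch of the periodic-task pattern those log lines come from, assuming oslo.service and oslo.config are importable; the Manager class and the spacing value are illustrative.

# Hedged sketch of the oslo.service periodic-task pattern behind the nova-compute lines above.
from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF
CONF.register_opts([cfg.IntOpt("reclaim_instance_interval", default=0)])

class Manager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(CONF)

    @periodic_task.periodic_task(spacing=60, run_immediately=True)   # spacing is illustrative
    def _reclaim_queued_deletes(self, context):
        if CONF.reclaim_instance_interval <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return
        print("would reclaim soft-deleted instances here")

Manager().run_periodic_tasks(context=None)
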
Dec 08 20:21:25 compute-0 podman[221946]: 2025-12-08 20:21:25.533155607 +0000 UTC m=+0.079362219 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:21:25 compute-0 podman[221945]: 2025-12-08 20:21:25.568122666 +0000 UTC m=+0.123371550 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 08 20:21:25 compute-0 nova_compute[187787]: 2025-12-08 20:21:25.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:21:26 compute-0 podman[221988]: 2025-12-08 20:21:26.516207838 +0000 UTC m=+0.080321830 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
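Editor's note: the three exporter containers in this section publish Prometheus metrics on host ports taken from their 'ports' entries (9105 for openstack_network_exporter, 9100 for node_exporter, 9882 for podman_exporter), each fronted by a web config file that may enable TLS. A hedged fetch sketch: the /metrics path and plain HTTP are assumptions, and whether client TLS is required depends on the mounted *.yaml web config files.

# Hedged sketch: pull a few metric lines from the exporters listed in this log.
import urllib.request

EXPORTERS = {
    "openstack_network_exporter": 9105,
    "node_exporter": 9100,
    "podman_exporter": 9882,
}

for name, port in EXPORTERS.items():
    url = f"http://localhost:{port}/metrics"      # assumed endpoint; TLS may be enforced instead
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            head = resp.read(400).decode("utf-8", "replace").splitlines()[:3]
            print(name, "->", head)
    except OSError as exc:                        # connection refused, TLS required, etc.
        print(name, "->", exc)
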
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.607 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.779 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.804 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.804 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.805 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.852 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.853 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.853 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:21:26 compute-0 nova_compute[187787]: 2025-12-08 20:21:26.854 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.131 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.134 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5611MB free_disk=72.87666320800781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.134 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.134 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.253 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.255 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.300 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.339 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.341 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:21:27 compute-0 nova_compute[187787]: 2025-12-08 20:21:27.342 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:21:28 compute-0 nova_compute[187787]: 2025-12-08 20:21:28.317 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:21:28 compute-0 nova_compute[187787]: 2025-12-08 20:21:28.318 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:21:29 compute-0 nova_compute[187787]: 2025-12-08 20:21:29.556 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:29 compute-0 podman[202017]: time="2025-12-08T20:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:21:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:21:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Dec 08 20:21:29 compute-0 nova_compute[187787]: 2025-12-08 20:21:29.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:21:31 compute-0 openstack_network_exporter[204149]: ERROR   20:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:21:31 compute-0 openstack_network_exporter[204149]: ERROR   20:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:21:31 compute-0 openstack_network_exporter[204149]: ERROR   20:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:21:31 compute-0 openstack_network_exporter[204149]: ERROR   20:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:21:31 compute-0 openstack_network_exporter[204149]: ERROR   20:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:21:31 compute-0 nova_compute[187787]: 2025-12-08 20:21:31.609 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:34 compute-0 nova_compute[187787]: 2025-12-08 20:21:34.559 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:36 compute-0 nova_compute[187787]: 2025-12-08 20:21:36.611 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:39 compute-0 nova_compute[187787]: 2025-12-08 20:21:39.561 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:40 compute-0 podman[222014]: 2025-12-08 20:21:40.504706433 +0000 UTC m=+0.082510419 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm)
Dec 08 20:21:41 compute-0 nova_compute[187787]: 2025-12-08 20:21:41.613 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:44 compute-0 nova_compute[187787]: 2025-12-08 20:21:44.563 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:44 compute-0 podman[222036]: 2025-12-08 20:21:44.746306676 +0000 UTC m=+0.073249658 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal)
Dec 08 20:21:45 compute-0 sshd-session[222034]: Invalid user harry from 45.78.217.210 port 45506
Dec 08 20:21:46 compute-0 sshd-session[222034]: Received disconnect from 45.78.217.210 port 45506:11: Bye Bye [preauth]
Dec 08 20:21:46 compute-0 sshd-session[222034]: Disconnected from invalid user harry 45.78.217.210 port 45506 [preauth]
Dec 08 20:21:46 compute-0 nova_compute[187787]: 2025-12-08 20:21:46.614 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:47 compute-0 podman[222059]: 2025-12-08 20:21:47.487087675 +0000 UTC m=+0.057488709 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:21:47 compute-0 podman[222058]: 2025-12-08 20:21:47.496335693 +0000 UTC m=+0.066216561 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:21:49 compute-0 nova_compute[187787]: 2025-12-08 20:21:49.566 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:51 compute-0 nova_compute[187787]: 2025-12-08 20:21:51.667 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:54 compute-0 nova_compute[187787]: 2025-12-08 20:21:54.568 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:21:54.999 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:21:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:21:55.000 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:21:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:21:55.000 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:21:56 compute-0 podman[222104]: 2025-12-08 20:21:56.543523765 +0000 UTC m=+0.099191906 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 08 20:21:56 compute-0 podman[222103]: 2025-12-08 20:21:56.546187549 +0000 UTC m=+0.115765413 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Dec 08 20:21:56 compute-0 podman[222147]: 2025-12-08 20:21:56.645169998 +0000 UTC m=+0.067178571 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:21:56 compute-0 nova_compute[187787]: 2025-12-08 20:21:56.670 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:59 compute-0 nova_compute[187787]: 2025-12-08 20:21:59.569 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:21:59 compute-0 podman[202017]: time="2025-12-08T20:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:21:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:21:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3471 "" "Go-http-client/1.1"
Dec 08 20:22:01 compute-0 openstack_network_exporter[204149]: ERROR   20:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:22:01 compute-0 openstack_network_exporter[204149]: ERROR   20:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:22:01 compute-0 openstack_network_exporter[204149]: ERROR   20:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:22:01 compute-0 openstack_network_exporter[204149]: ERROR   20:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:22:01 compute-0 openstack_network_exporter[204149]: ERROR   20:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:22:01 compute-0 nova_compute[187787]: 2025-12-08 20:22:01.672 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:02 compute-0 sshd-session[222171]: Received disconnect from 45.174.162.68 port 65477:11: Bye Bye [preauth]
Dec 08 20:22:02 compute-0 sshd-session[222171]: Disconnected from authenticating user root 45.174.162.68 port 65477 [preauth]
Dec 08 20:22:04 compute-0 nova_compute[187787]: 2025-12-08 20:22:04.571 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:05 compute-0 sshd-session[222174]: Received disconnect from 200.155.38.219 port 14021:11: Bye Bye [preauth]
Dec 08 20:22:05 compute-0 sshd-session[222174]: Disconnected from authenticating user root 200.155.38.219 port 14021 [preauth]
Dec 08 20:22:06 compute-0 nova_compute[187787]: 2025-12-08 20:22:06.674 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:09 compute-0 nova_compute[187787]: 2025-12-08 20:22:09.573 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:11 compute-0 podman[222176]: 2025-12-08 20:22:11.496180902 +0000 UTC m=+0.065863749 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:22:11 compute-0 nova_compute[187787]: 2025-12-08 20:22:11.688 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:14 compute-0 nova_compute[187787]: 2025-12-08 20:22:14.574 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:15 compute-0 podman[222197]: 2025-12-08 20:22:15.508462084 +0000 UTC m=+0.083483238 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=edpm, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 08 20:22:16 compute-0 nova_compute[187787]: 2025-12-08 20:22:16.690 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:18 compute-0 podman[222220]: 2025-12-08 20:22:18.497968868 +0000 UTC m=+0.065266621 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 08 20:22:18 compute-0 podman[222219]: 2025-12-08 20:22:18.501916602 +0000 UTC m=+0.072090624 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:22:19 compute-0 nova_compute[187787]: 2025-12-08 20:22:19.576 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:21 compute-0 nova_compute[187787]: 2025-12-08 20:22:21.692 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:24 compute-0 nova_compute[187787]: 2025-12-08 20:22:24.578 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:24 compute-0 nova_compute[187787]: 2025-12-08 20:22:24.775 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:22:24 compute-0 nova_compute[187787]: 2025-12-08 20:22:24.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:22:24 compute-0 nova_compute[187787]: 2025-12-08 20:22:24.779 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:22:25 compute-0 sshd-session[222264]: Received disconnect from 103.172.28.62 port 60072:11: Bye Bye [preauth]
Dec 08 20:22:25 compute-0 sshd-session[222264]: Disconnected from authenticating user root 103.172.28.62 port 60072 [preauth]
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.695 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.803 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.804 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.851 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.852 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.852 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:22:26 compute-0 nova_compute[187787]: 2025-12-08 20:22:26.852 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.074 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.075 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5623MB free_disk=72.87666320800781GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.075 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.075 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.203 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.204 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.231 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.259 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.261 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:22:27 compute-0 nova_compute[187787]: 2025-12-08 20:22:27.262 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.186s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:22:27 compute-0 podman[222266]: 2025-12-08 20:22:27.486994564 +0000 UTC m=+0.056609813 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:22:27 compute-0 podman[222271]: 2025-12-08 20:22:27.536221945 +0000 UTC m=+0.085188772 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 08 20:22:27 compute-0 podman[222267]: 2025-12-08 20:22:27.562063059 +0000 UTC m=+0.119812128 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller)
Dec 08 20:22:28 compute-0 nova_compute[187787]: 2025-12-08 20:22:28.238 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:22:28 compute-0 nova_compute[187787]: 2025-12-08 20:22:28.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:22:28 compute-0 nova_compute[187787]: 2025-12-08 20:22:28.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:22:28 compute-0 nova_compute[187787]: 2025-12-08 20:22:28.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
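The ComputeManager._poll_* and _instance_usage_audit entries are nova periodic tasks being driven by oslo.service's run_periodic_tasks, as the file path in each line shows. A rough sketch of how such tasks are declared with oslo_service.periodic_task; the class, task names and 60-second spacing below are illustrative stand-ins, not nova's actual configuration:

    from oslo_config import cfg
    from oslo_service import periodic_task


    class DemoManager(periodic_task.PeriodicTasks):
        """Stand-in manager; nova's ComputeManager declares its tasks the same way."""

        @periodic_task.periodic_task(spacing=60, run_immediately=True)
        def _poll_something(self, context):
            print("running _poll_something")

        @periodic_task.periodic_task(spacing=60)
        def _audit_something(self, context):
            print("running _audit_something")


    mgr = DemoManager(cfg.CONF)
    # The service normally calls this on a timer; each call runs whichever tasks are due.
    mgr.run_periodic_tasks(context=None)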
Dec 08 20:22:29 compute-0 nova_compute[187787]: 2025-12-08 20:22:29.580 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:29 compute-0 podman[202017]: time="2025-12-08T20:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:22:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:22:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
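Those two GETs are the libpod REST API being polled over podman's unix socket (the podman_exporter config later in this log points CONTAINER_HOST at unix:///run/podman/podman.sock). A stdlib-only sketch of the same containers/json request, assuming that socket path and enough privileges to read it; the Names/State fields printed at the end are the usual libpod list fields, not values taken from this capture:

    import http.client
    import json
    import socket


    class UnixHTTPConnection(http.client.HTTPConnection):
        """HTTPConnection that talks to a unix socket instead of TCP."""

        def __init__(self, path):
            super().__init__("localhost")
            self._path = path

        def connect(self):
            sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
            sock.connect(self._path)
            self.sock = sock


    conn = UnixHTTPConnection("/run/podman/podman.sock")
    conn.request("GET", "/v4.9.3/libpod/containers/json?all=true")
    resp = conn.getresponse()
    for container in json.loads(resp.read()):
        print(container.get("Names"), container.get("State"))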
Dec 08 20:22:29 compute-0 nova_compute[187787]: 2025-12-08 20:22:29.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:22:31 compute-0 openstack_network_exporter[204149]: ERROR   20:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:22:31 compute-0 openstack_network_exporter[204149]: ERROR   20:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:22:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:22:31 compute-0 openstack_network_exporter[204149]: ERROR   20:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:22:31 compute-0 openstack_network_exporter[204149]: ERROR   20:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:22:31 compute-0 openstack_network_exporter[204149]: ERROR   20:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:22:31 compute-0 openstack_network_exporter[204149]: 
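These exporter errors all come down to missing daemon control sockets: appctl-style calls need the *.ctl files that ovs-vswitchd, ovsdb-server and the OVN daemons create under their run directories, and this exporter maps the host's /var/run/openvswitch and /var/lib/openvswitch/ovn into its own /run/openvswitch and /run/ovn (see its config_data in the container entry further down). ovn-northd in particular typically runs with the OVN databases on the control plane, so its socket is not expected on a compute node at all. A small sketch that checks those conventional host-side locations; the *.ctl naming is the usual OVS convention, assumed rather than taken from this log:

    import glob
    import os

    # Conventional OVS/OVN control-socket locations on the host side of the
    # exporter's bind mounts (ovsdb-server.<pid>.ctl, ovs-vswitchd.<pid>.ctl, ...).
    RUN_DIRS = ["/var/run/openvswitch", "/var/lib/openvswitch/ovn"]

    for run_dir in RUN_DIRS:
        ctl_files = glob.glob(os.path.join(run_dir, "*.ctl"))
        if ctl_files:
            for ctl in ctl_files:
                print("control socket present:", ctl)
        else:
            print("no control sockets under", run_dir)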
Dec 08 20:22:31 compute-0 nova_compute[187787]: 2025-12-08 20:22:31.697 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:32 compute-0 nova_compute[187787]: 2025-12-08 20:22:32.774 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:22:34 compute-0 nova_compute[187787]: 2025-12-08 20:22:34.582 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:36 compute-0 nova_compute[187787]: 2025-12-08 20:22:36.699 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:39 compute-0 nova_compute[187787]: 2025-12-08 20:22:39.584 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:41 compute-0 nova_compute[187787]: 2025-12-08 20:22:41.701 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
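The recurring "[POLLIN] on fd 30 __log_wakeup" lines are the OVS Python poller reporting which event woke it while nova's OVSDB connection idles; ovs/poller.py logs the ready events after each wait. A generic sketch of that mechanism using select.poll, where a pipe stands in for the OVSDB socket; this illustrates the underlying pattern, not the ovs.poller code itself:

    import os
    import select

    # A pipe stands in for the OVSDB connection fd; writing makes it readable.
    read_fd, write_fd = os.pipe()
    os.write(write_fd, b"x")

    poller = select.poll()
    poller.register(read_fd, select.POLLIN)

    # poll() blocks until an event arrives; reporting the events that woke it
    # is what produces the "[POLLIN] on fd N" debug lines in this journal.
    for fd, events in poller.poll():
        if events & select.POLLIN:
            print(f"[POLLIN] on fd {fd}")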
Dec 08 20:22:42 compute-0 podman[222334]: 2025-12-08 20:22:42.510120361 +0000 UTC m=+0.076586923 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 08 20:22:44 compute-0 nova_compute[187787]: 2025-12-08 20:22:44.585 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:46 compute-0 podman[222354]: 2025-12-08 20:22:46.516247161 +0000 UTC m=+0.088027979 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Dec 08 20:22:46 compute-0 nova_compute[187787]: 2025-12-08 20:22:46.740 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:49 compute-0 sshd-session[222375]: Received disconnect from 47.76.127.165 port 57116:11: Bye Bye [preauth]
Dec 08 20:22:49 compute-0 sshd-session[222375]: Disconnected from authenticating user root 47.76.127.165 port 57116 [preauth]
Dec 08 20:22:49 compute-0 podman[222378]: 2025-12-08 20:22:49.485693792 +0000 UTC m=+0.055935441 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 08 20:22:49 compute-0 podman[222377]: 2025-12-08 20:22:49.509242974 +0000 UTC m=+0.084467639 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:22:49 compute-0 nova_compute[187787]: 2025-12-08 20:22:49.586 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:51 compute-0 nova_compute[187787]: 2025-12-08 20:22:51.743 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:54 compute-0 nova_compute[187787]: 2025-12-08 20:22:54.588 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:22:55.000 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:22:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:22:55.001 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:22:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:22:55.001 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
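The Acquiring / acquired / released trio above is oslo.concurrency's lock decorator doing its bookkeeping around neutron's _check_child_processes. A minimal sketch of the same pattern; the lock name matches the log, the function body is a stand-in:

    from oslo_concurrency import lockutils


    @lockutils.synchronized("_check_child_processes")
    def check_child_processes():
        # Stand-in body; the real method inspects the monitored child processes.
        print("checking child processes")


    check_child_processes()

With debug logging enabled, each call emits the same acquire/release lines shown above.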
Dec 08 20:22:56 compute-0 nova_compute[187787]: 2025-12-08 20:22:56.747 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:58 compute-0 podman[222422]: 2025-12-08 20:22:58.498622281 +0000 UTC m=+0.067327155 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:22:58 compute-0 podman[222424]: 2025-12-08 20:22:58.518013914 +0000 UTC m=+0.075343145 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 08 20:22:58 compute-0 podman[222423]: 2025-12-08 20:22:58.543256939 +0000 UTC m=+0.093108966 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Dec 08 20:22:59 compute-0 nova_compute[187787]: 2025-12-08 20:22:59.590 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:22:59 compute-0 podman[202017]: time="2025-12-08T20:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:22:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:22:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3467 "" "Go-http-client/1.1"
Dec 08 20:23:01 compute-0 openstack_network_exporter[204149]: ERROR   20:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:23:01 compute-0 openstack_network_exporter[204149]: ERROR   20:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:23:01 compute-0 openstack_network_exporter[204149]: ERROR   20:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:23:01 compute-0 openstack_network_exporter[204149]: ERROR   20:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:23:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:23:01 compute-0 openstack_network_exporter[204149]: ERROR   20:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:23:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:23:01 compute-0 anacron[105339]: Job `cron.daily' started
Dec 08 20:23:01 compute-0 anacron[105339]: Job `cron.daily' terminated
Dec 08 20:23:01 compute-0 nova_compute[187787]: 2025-12-08 20:23:01.748 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:04 compute-0 nova_compute[187787]: 2025-12-08 20:23:04.592 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:06 compute-0 nova_compute[187787]: 2025-12-08 20:23:06.751 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:09 compute-0 nova_compute[187787]: 2025-12-08 20:23:09.594 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:11 compute-0 nova_compute[187787]: 2025-12-08 20:23:11.757 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:13 compute-0 podman[222494]: 2025-12-08 20:23:13.516998472 +0000 UTC m=+0.079901437 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:23:14 compute-0 nova_compute[187787]: 2025-12-08 20:23:14.624 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:16 compute-0 nova_compute[187787]: 2025-12-08 20:23:16.757 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:17 compute-0 podman[222515]: 2025-12-08 20:23:17.538256642 +0000 UTC m=+0.109036213 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, maintainer=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Dec 08 20:23:18 compute-0 sshd-session[222513]: Invalid user httpd from 45.174.162.68 port 1371
Dec 08 20:23:18 compute-0 sshd-session[222513]: Received disconnect from 45.174.162.68 port 1371:11: Bye Bye [preauth]
Dec 08 20:23:18 compute-0 sshd-session[222513]: Disconnected from invalid user httpd 45.174.162.68 port 1371 [preauth]
Dec 08 20:23:19 compute-0 nova_compute[187787]: 2025-12-08 20:23:19.660 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.837 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.837 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.837 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.838 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.853 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:23:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:23:20 compute-0 podman[222539]: 2025-12-08 20:23:20.511949885 +0000 UTC m=+0.075405067 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 08 20:23:20 compute-0 podman[222538]: 2025-12-08 20:23:20.528693116 +0000 UTC m=+0.092004023 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:23:21 compute-0 nova_compute[187787]: 2025-12-08 20:23:21.808 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:24 compute-0 nova_compute[187787]: 2025-12-08 20:23:24.666 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:24 compute-0 nova_compute[187787]: 2025-12-08 20:23:24.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:24 compute-0 nova_compute[187787]: 2025-12-08 20:23:24.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:23:24 compute-0 nova_compute[187787]: 2025-12-08 20:23:24.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:24 compute-0 nova_compute[187787]: 2025-12-08 20:23:24.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 08 20:23:25 compute-0 nova_compute[187787]: 2025-12-08 20:23:25.793 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:26 compute-0 nova_compute[187787]: 2025-12-08 20:23:26.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:26 compute-0 nova_compute[187787]: 2025-12-08 20:23:26.809 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:26 compute-0 nova_compute[187787]: 2025-12-08 20:23:26.825 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:23:26 compute-0 nova_compute[187787]: 2025-12-08 20:23:26.825 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:23:26 compute-0 nova_compute[187787]: 2025-12-08 20:23:26.825 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:23:26 compute-0 nova_compute[187787]: 2025-12-08 20:23:26.826 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.048 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.049 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5631MB free_disk=72.87665939331055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.049 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.050 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.275 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.276 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.379 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.401 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.404 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:23:27 compute-0 nova_compute[187787]: 2025-12-08 20:23:27.404 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.354s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.405 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.406 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.407 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.434 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.435 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:29 compute-0 podman[222585]: 2025-12-08 20:23:29.52907888 +0000 UTC m=+0.089114421 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:23:29 compute-0 podman[222586]: 2025-12-08 20:23:29.568996161 +0000 UTC m=+0.132383777 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 08 20:23:29 compute-0 podman[222587]: 2025-12-08 20:23:29.580015424 +0000 UTC m=+0.130361573 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.669 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:29 compute-0 podman[202017]: time="2025-12-08T20:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:23:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:23:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3473 "" "Go-http-client/1.1"
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.782 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 08 20:23:29 compute-0 nova_compute[187787]: 2025-12-08 20:23:29.809 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 08 20:23:30 compute-0 sshd-session[222582]: Received disconnect from 45.78.228.32 port 35074:11: Bye Bye [preauth]
Dec 08 20:23:30 compute-0 sshd-session[222582]: Disconnected from authenticating user root 45.78.228.32 port 35074 [preauth]
Dec 08 20:23:30 compute-0 nova_compute[187787]: 2025-12-08 20:23:30.808 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:31 compute-0 openstack_network_exporter[204149]: ERROR   20:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:23:31 compute-0 openstack_network_exporter[204149]: ERROR   20:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:23:31 compute-0 openstack_network_exporter[204149]: ERROR   20:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:23:31 compute-0 openstack_network_exporter[204149]: ERROR   20:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:23:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:23:31 compute-0 openstack_network_exporter[204149]: ERROR   20:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:23:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:23:31 compute-0 nova_compute[187787]: 2025-12-08 20:23:31.812 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:34 compute-0 nova_compute[187787]: 2025-12-08 20:23:34.722 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:35 compute-0 nova_compute[187787]: 2025-12-08 20:23:35.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:23:36 compute-0 nova_compute[187787]: 2025-12-08 20:23:36.813 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:39 compute-0 nova_compute[187787]: 2025-12-08 20:23:39.775 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:41 compute-0 nova_compute[187787]: 2025-12-08 20:23:41.813 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:44 compute-0 podman[222653]: 2025-12-08 20:23:44.501235505 +0000 UTC m=+0.078076238 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Dec 08 20:23:44 compute-0 nova_compute[187787]: 2025-12-08 20:23:44.812 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:46 compute-0 nova_compute[187787]: 2025-12-08 20:23:46.816 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:48 compute-0 podman[222675]: 2025-12-08 20:23:48.538749546 +0000 UTC m=+0.103529949 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 08 20:23:49 compute-0 nova_compute[187787]: 2025-12-08 20:23:49.849 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:51 compute-0 podman[222698]: 2025-12-08 20:23:51.522639182 +0000 UTC m=+0.080963358 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:23:51 compute-0 podman[222699]: 2025-12-08 20:23:51.556886207 +0000 UTC m=+0.111376974 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 08 20:23:51 compute-0 nova_compute[187787]: 2025-12-08 20:23:51.820 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:54 compute-0 sshd-session[222741]: Received disconnect from 200.155.38.219 port 5744:11: Bye Bye [preauth]
Dec 08 20:23:54 compute-0 sshd-session[222741]: Disconnected from authenticating user root 200.155.38.219 port 5744 [preauth]
Dec 08 20:23:54 compute-0 nova_compute[187787]: 2025-12-08 20:23:54.852 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:23:55.000 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:23:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:23:55.001 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:23:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:23:55.003 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:23:56 compute-0 nova_compute[187787]: 2025-12-08 20:23:56.821 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:23:59 compute-0 sshd-session[222743]: Received disconnect from 103.172.28.62 port 39098:11: Bye Bye [preauth]
Dec 08 20:23:59 compute-0 sshd-session[222743]: Disconnected from authenticating user root 103.172.28.62 port 39098 [preauth]
Dec 08 20:23:59 compute-0 podman[202017]: time="2025-12-08T20:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:23:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:23:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Dec 08 20:23:59 compute-0 nova_compute[187787]: 2025-12-08 20:23:59.857 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:00 compute-0 podman[222745]: 2025-12-08 20:24:00.503748673 +0000 UTC m=+0.070452412 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:24:00 compute-0 podman[222747]: 2025-12-08 20:24:00.553592122 +0000 UTC m=+0.098438800 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 08 20:24:00 compute-0 podman[222746]: 2025-12-08 20:24:00.562562131 +0000 UTC m=+0.117515874 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 08 20:24:01 compute-0 openstack_network_exporter[204149]: ERROR   20:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:24:01 compute-0 openstack_network_exporter[204149]: ERROR   20:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:24:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:24:01 compute-0 openstack_network_exporter[204149]: ERROR   20:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:24:01 compute-0 openstack_network_exporter[204149]: ERROR   20:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:24:01 compute-0 openstack_network_exporter[204149]: ERROR   20:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:24:01 compute-0 nova_compute[187787]: 2025-12-08 20:24:01.825 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:04 compute-0 nova_compute[187787]: 2025-12-08 20:24:04.893 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:06 compute-0 nova_compute[187787]: 2025-12-08 20:24:06.829 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:09 compute-0 nova_compute[187787]: 2025-12-08 20:24:09.937 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:11 compute-0 nova_compute[187787]: 2025-12-08 20:24:11.835 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:14 compute-0 podman[222815]: 2025-12-08 20:24:14.749073444 +0000 UTC m=+0.075939512 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_id=edpm)
Dec 08 20:24:14 compute-0 nova_compute[187787]: 2025-12-08 20:24:14.942 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:16 compute-0 nova_compute[187787]: 2025-12-08 20:24:16.884 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:19 compute-0 podman[222836]: 2025-12-08 20:24:19.494211184 +0000 UTC m=+0.069419299 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Dec 08 20:24:19 compute-0 nova_compute[187787]: 2025-12-08 20:24:19.946 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:21 compute-0 nova_compute[187787]: 2025-12-08 20:24:21.887 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:22 compute-0 podman[222862]: 2025-12-08 20:24:22.517527515 +0000 UTC m=+0.073880738 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:24:22 compute-0 podman[222863]: 2025-12-08 20:24:22.541305004 +0000 UTC m=+0.095761118 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:24:22 compute-0 sshd-session[222858]: Received disconnect from 45.78.217.210 port 53550:11: Bye Bye [preauth]
Dec 08 20:24:22 compute-0 sshd-session[222858]: Disconnected from authenticating user root 45.78.217.210 port 53550 [preauth]
Dec 08 20:24:23 compute-0 sshd-session[222860]: Invalid user administrator from 47.76.127.165 port 57392
Dec 08 20:24:23 compute-0 sshd-session[222860]: Received disconnect from 47.76.127.165 port 57392:11: Bye Bye [preauth]
Dec 08 20:24:23 compute-0 sshd-session[222860]: Disconnected from invalid user administrator 47.76.127.165 port 57392 [preauth]
Dec 08 20:24:24 compute-0 nova_compute[187787]: 2025-12-08 20:24:24.842 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:24 compute-0 nova_compute[187787]: 2025-12-08 20:24:24.843 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:24:24 compute-0 nova_compute[187787]: 2025-12-08 20:24:24.949 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:26 compute-0 nova_compute[187787]: 2025-12-08 20:24:26.775 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:26 compute-0 nova_compute[187787]: 2025-12-08 20:24:26.890 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.808 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.809 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.809 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.845 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.845 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.845 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:24:28 compute-0 nova_compute[187787]: 2025-12-08 20:24:28.846 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.049 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.050 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5632MB free_disk=72.87673950195312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.051 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.051 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.135 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.136 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.309 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.327 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.329 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.329 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:24:29 compute-0 podman[202017]: time="2025-12-08T20:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:24:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:24:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Dec 08 20:24:29 compute-0 nova_compute[187787]: 2025-12-08 20:24:29.954 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:30 compute-0 nova_compute[187787]: 2025-12-08 20:24:30.300 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:30 compute-0 nova_compute[187787]: 2025-12-08 20:24:30.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:30 compute-0 nova_compute[187787]: 2025-12-08 20:24:30.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:31 compute-0 openstack_network_exporter[204149]: ERROR   20:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:24:31 compute-0 openstack_network_exporter[204149]: ERROR   20:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:24:31 compute-0 openstack_network_exporter[204149]: ERROR   20:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:24:31 compute-0 openstack_network_exporter[204149]: ERROR   20:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:24:31 compute-0 openstack_network_exporter[204149]: ERROR   20:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:24:31 compute-0 podman[222906]: 2025-12-08 20:24:31.5093838 +0000 UTC m=+0.066794897 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:24:31 compute-0 podman[222904]: 2025-12-08 20:24:31.518770692 +0000 UTC m=+0.087378317 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:24:31 compute-0 podman[222905]: 2025-12-08 20:24:31.581796311 +0000 UTC m=+0.141100957 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Dec 08 20:24:31 compute-0 nova_compute[187787]: 2025-12-08 20:24:31.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:31 compute-0 nova_compute[187787]: 2025-12-08 20:24:31.893 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:34 compute-0 nova_compute[187787]: 2025-12-08 20:24:34.775 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:24:34 compute-0 nova_compute[187787]: 2025-12-08 20:24:34.957 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:36 compute-0 nova_compute[187787]: 2025-12-08 20:24:36.896 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:39 compute-0 sshd-session[222972]: Received disconnect from 45.174.162.68 port 55315:11: Bye Bye [preauth]
Dec 08 20:24:39 compute-0 sshd-session[222972]: Disconnected from authenticating user root 45.174.162.68 port 55315 [preauth]
Dec 08 20:24:39 compute-0 nova_compute[187787]: 2025-12-08 20:24:39.961 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:41 compute-0 nova_compute[187787]: 2025-12-08 20:24:41.898 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:44 compute-0 nova_compute[187787]: 2025-12-08 20:24:44.965 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:45 compute-0 podman[222974]: 2025-12-08 20:24:45.50126098 +0000 UTC m=+0.069537713 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 08 20:24:46 compute-0 nova_compute[187787]: 2025-12-08 20:24:46.900 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:49 compute-0 nova_compute[187787]: 2025-12-08 20:24:49.970 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:50 compute-0 podman[222996]: 2025-12-08 20:24:50.506981651 +0000 UTC m=+0.070494192 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6)
Dec 08 20:24:51 compute-0 nova_compute[187787]: 2025-12-08 20:24:51.902 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:53 compute-0 podman[223018]: 2025-12-08 20:24:53.521111787 +0000 UTC m=+0.081635018 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 08 20:24:53 compute-0 podman[223017]: 2025-12-08 20:24:53.536047081 +0000 UTC m=+0.103216169 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 08 20:24:54 compute-0 nova_compute[187787]: 2025-12-08 20:24:54.997 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:24:55.001 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:24:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:24:55.002 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:24:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:24:55.002 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:24:56 compute-0 nova_compute[187787]: 2025-12-08 20:24:56.907 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:24:59 compute-0 podman[202017]: time="2025-12-08T20:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:24:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:24:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3471 "" "Go-http-client/1.1"
Dec 08 20:25:00 compute-0 nova_compute[187787]: 2025-12-08 20:25:00.001 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:01 compute-0 openstack_network_exporter[204149]: ERROR   20:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:25:01 compute-0 openstack_network_exporter[204149]: ERROR   20:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:25:01 compute-0 openstack_network_exporter[204149]: ERROR   20:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:25:01 compute-0 openstack_network_exporter[204149]: ERROR   20:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:25:01 compute-0 openstack_network_exporter[204149]: ERROR   20:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:25:01 compute-0 nova_compute[187787]: 2025-12-08 20:25:01.908 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:02 compute-0 podman[223057]: 2025-12-08 20:25:02.499917287 +0000 UTC m=+0.071017388 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 08 20:25:02 compute-0 podman[223059]: 2025-12-08 20:25:02.510904049 +0000 UTC m=+0.074578530 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:25:02 compute-0 podman[223058]: 2025-12-08 20:25:02.553693159 +0000 UTC m=+0.111389104 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:25:05 compute-0 nova_compute[187787]: 2025-12-08 20:25:05.035 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:06 compute-0 nova_compute[187787]: 2025-12-08 20:25:06.910 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:10 compute-0 nova_compute[187787]: 2025-12-08 20:25:10.040 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:11 compute-0 nova_compute[187787]: 2025-12-08 20:25:11.911 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:15 compute-0 nova_compute[187787]: 2025-12-08 20:25:15.045 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:16 compute-0 podman[223123]: 2025-12-08 20:25:16.520903631 +0000 UTC m=+0.090603598 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm)
Dec 08 20:25:16 compute-0 nova_compute[187787]: 2025-12-08 20:25:16.911 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.837 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.838 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.838 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.839 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.840 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.840 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.840 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.841 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac4f2f0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.853 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.853 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.853 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.854 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.855 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.856 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.856 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.856 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.856 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:25:19.856 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:25:20 compute-0 nova_compute[187787]: 2025-12-08 20:25:20.049 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:21 compute-0 podman[223148]: 2025-12-08 20:25:21.51013468 +0000 UTC m=+0.074198898 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, name=ubi9-minimal)
Dec 08 20:25:21 compute-0 nova_compute[187787]: 2025-12-08 20:25:21.914 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:24 compute-0 podman[223171]: 2025-12-08 20:25:24.488279376 +0000 UTC m=+0.052148071 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 08 20:25:24 compute-0 podman[223170]: 2025-12-08 20:25:24.495607744 +0000 UTC m=+0.060775030 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:25:24 compute-0 nova_compute[187787]: 2025-12-08 20:25:24.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:25:24 compute-0 nova_compute[187787]: 2025-12-08 20:25:24.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:25:25 compute-0 nova_compute[187787]: 2025-12-08 20:25:25.052 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:26 compute-0 nova_compute[187787]: 2025-12-08 20:25:26.914 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:27 compute-0 nova_compute[187787]: 2025-12-08 20:25:27.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:25:28 compute-0 nova_compute[187787]: 2025-12-08 20:25:28.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:25:28 compute-0 nova_compute[187787]: 2025-12-08 20:25:28.828 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:25:28 compute-0 nova_compute[187787]: 2025-12-08 20:25:28.829 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:25:28 compute-0 nova_compute[187787]: 2025-12-08 20:25:28.829 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:25:28 compute-0 nova_compute[187787]: 2025-12-08 20:25:28.830 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.002 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.003 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5627MB free_disk=72.87625122070312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.004 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.004 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.073 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.073 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.090 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing inventories for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.118 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Updating ProviderTree inventory for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.119 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.137 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing aggregate associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.160 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing trait associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,HW_CPU_X86_FMA3,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE4A,HW_CPU_X86_ABM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.190 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.209 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.211 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:25:29 compute-0 nova_compute[187787]: 2025-12-08 20:25:29.211 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:25:29 compute-0 podman[202017]: time="2025-12-08T20:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:25:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:25:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3461 "" "Go-http-client/1.1"
Dec 08 20:25:30 compute-0 nova_compute[187787]: 2025-12-08 20:25:30.057 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:30 compute-0 nova_compute[187787]: 2025-12-08 20:25:30.215 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:25:30 compute-0 nova_compute[187787]: 2025-12-08 20:25:30.784 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:25:30 compute-0 nova_compute[187787]: 2025-12-08 20:25:30.786 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:25:30 compute-0 nova_compute[187787]: 2025-12-08 20:25:30.786 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:25:30 compute-0 nova_compute[187787]: 2025-12-08 20:25:30.825 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:25:31 compute-0 sshd[129409]: Timeout before authentication for connection from 101.47.160.247 to 38.102.83.66, pid = 222584
Dec 08 20:25:31 compute-0 openstack_network_exporter[204149]: ERROR   20:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:25:31 compute-0 openstack_network_exporter[204149]: ERROR   20:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:25:31 compute-0 openstack_network_exporter[204149]: ERROR   20:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:25:31 compute-0 openstack_network_exporter[204149]: ERROR   20:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:25:31 compute-0 openstack_network_exporter[204149]: ERROR   20:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:25:31 compute-0 nova_compute[187787]: 2025-12-08 20:25:31.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:25:31 compute-0 nova_compute[187787]: 2025-12-08 20:25:31.916 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:32 compute-0 nova_compute[187787]: 2025-12-08 20:25:32.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:25:32 compute-0 nova_compute[187787]: 2025-12-08 20:25:32.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:25:32 compute-0 nova_compute[187787]: 2025-12-08 20:25:32.782 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:25:33 compute-0 podman[223215]: 2025-12-08 20:25:33.515186912 +0000 UTC m=+0.077129329 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:25:33 compute-0 podman[223217]: 2025-12-08 20:25:33.564700441 +0000 UTC m=+0.112360724 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 08 20:25:33 compute-0 podman[223216]: 2025-12-08 20:25:33.60422681 +0000 UTC m=+0.156754374 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:25:35 compute-0 nova_compute[187787]: 2025-12-08 20:25:35.061 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:36 compute-0 nova_compute[187787]: 2025-12-08 20:25:36.919 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:38 compute-0 sshd-session[223280]: Received disconnect from 103.172.28.62 port 42474:11: Bye Bye [preauth]
Dec 08 20:25:38 compute-0 sshd-session[223280]: Disconnected from authenticating user root 103.172.28.62 port 42474 [preauth]
Dec 08 20:25:40 compute-0 nova_compute[187787]: 2025-12-08 20:25:40.063 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:41 compute-0 nova_compute[187787]: 2025-12-08 20:25:41.922 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:44 compute-0 sshd[129409]: drop connection #0 from [101.47.160.247]:43808 on [38.102.83.66]:22 penalty: exceeded LoginGraceTime
Dec 08 20:25:45 compute-0 nova_compute[187787]: 2025-12-08 20:25:45.067 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:46 compute-0 nova_compute[187787]: 2025-12-08 20:25:46.923 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:47 compute-0 podman[223284]: 2025-12-08 20:25:47.516317213 +0000 UTC m=+0.083239426 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42)
Dec 08 20:25:47 compute-0 sshd-session[223282]: Received disconnect from 200.155.38.219 port 52272:11: Bye Bye [preauth]
Dec 08 20:25:47 compute-0 sshd-session[223282]: Disconnected from authenticating user root 200.155.38.219 port 52272 [preauth]
Dec 08 20:25:50 compute-0 nova_compute[187787]: 2025-12-08 20:25:50.070 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:51 compute-0 nova_compute[187787]: 2025-12-08 20:25:51.925 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:52 compute-0 podman[223305]: 2025-12-08 20:25:52.557451831 +0000 UTC m=+0.115769016 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git)
Dec 08 20:25:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:25:55.002 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:25:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:25:55.002 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:25:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:25:55.002 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:25:55 compute-0 nova_compute[187787]: 2025-12-08 20:25:55.075 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:55 compute-0 podman[223327]: 2025-12-08 20:25:55.505660279 +0000 UTC m=+0.066961114 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:25:55 compute-0 podman[223328]: 2025-12-08 20:25:55.543714044 +0000 UTC m=+0.095317154 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:25:56 compute-0 nova_compute[187787]: 2025-12-08 20:25:56.927 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:25:57 compute-0 sshd-session[223371]: Received disconnect from 222.172.32.246 port 2184:11: Bye Bye [preauth]
Dec 08 20:25:57 compute-0 sshd-session[223371]: Disconnected from authenticating user root 222.172.32.246 port 2184 [preauth]
Dec 08 20:25:59 compute-0 podman[202017]: time="2025-12-08T20:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:25:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:25:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3460 "" "Go-http-client/1.1"
Dec 08 20:26:00 compute-0 nova_compute[187787]: 2025-12-08 20:26:00.104 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:00 compute-0 sshd-session[223375]: Invalid user edge from 47.76.127.165 port 35238
Dec 08 20:26:00 compute-0 sshd-session[223375]: Received disconnect from 47.76.127.165 port 35238:11: Bye Bye [preauth]
Dec 08 20:26:00 compute-0 sshd-session[223375]: Disconnected from invalid user edge 47.76.127.165 port 35238 [preauth]
Dec 08 20:26:01 compute-0 openstack_network_exporter[204149]: ERROR   20:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:26:01 compute-0 openstack_network_exporter[204149]: ERROR   20:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:26:01 compute-0 openstack_network_exporter[204149]: ERROR   20:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:26:01 compute-0 openstack_network_exporter[204149]: ERROR   20:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:26:01 compute-0 openstack_network_exporter[204149]: ERROR   20:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:26:01 compute-0 nova_compute[187787]: 2025-12-08 20:26:01.929 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:04 compute-0 podman[223377]: 2025-12-08 20:26:04.516299695 +0000 UTC m=+0.076642017 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:26:04 compute-0 podman[223379]: 2025-12-08 20:26:04.544112358 +0000 UTC m=+0.091803524 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 08 20:26:04 compute-0 podman[223378]: 2025-12-08 20:26:04.591977581 +0000 UTC m=+0.149990360 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 08 20:26:05 compute-0 nova_compute[187787]: 2025-12-08 20:26:05.107 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:06 compute-0 nova_compute[187787]: 2025-12-08 20:26:06.932 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:10 compute-0 nova_compute[187787]: 2025-12-08 20:26:10.111 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:11 compute-0 nova_compute[187787]: 2025-12-08 20:26:11.934 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:12 compute-0 nova_compute[187787]: 2025-12-08 20:26:12.092 187791 DEBUG oslo_concurrency.processutils [None req-ea4f8cf0-569b-4a3f-9f19-994ec936e20d 334a76f25c164f46a25714e3003b5898 aeda4e9ec2bc42cf85eb51bfa0b2ae46 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 08 20:26:12 compute-0 nova_compute[187787]: 2025-12-08 20:26:12.137 187791 DEBUG oslo_concurrency.processutils [None req-ea4f8cf0-569b-4a3f-9f19-994ec936e20d 334a76f25c164f46a25714e3003b5898 aeda4e9ec2bc42cf85eb51bfa0b2ae46 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 08 20:26:15 compute-0 nova_compute[187787]: 2025-12-08 20:26:15.114 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:16 compute-0 nova_compute[187787]: 2025-12-08 20:26:16.936 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:18 compute-0 podman[223444]: 2025-12-08 20:26:18.523439289 +0000 UTC m=+0.083324807 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 08 20:26:18 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:26:18.656 105024 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ea:67:f9', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '1e:d7:e5:ba:bd:f4'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 08 20:26:18 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:26:18.658 105024 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 08 20:26:18 compute-0 nova_compute[187787]: 2025-12-08 20:26:18.692 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:20 compute-0 nova_compute[187787]: 2025-12-08 20:26:20.118 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:21 compute-0 nova_compute[187787]: 2025-12-08 20:26:21.939 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:23 compute-0 podman[223465]: 2025-12-08 20:26:23.506299208 +0000 UTC m=+0.077851036 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 08 20:26:24 compute-0 nova_compute[187787]: 2025-12-08 20:26:24.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:24 compute-0 nova_compute[187787]: 2025-12-08 20:26:24.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:26:25 compute-0 nova_compute[187787]: 2025-12-08 20:26:25.121 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:26 compute-0 podman[223486]: 2025-12-08 20:26:26.4954173 +0000 UTC m=+0.065979523 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:26:26 compute-0 podman[223487]: 2025-12-08 20:26:26.519218738 +0000 UTC m=+0.084022860 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 08 20:26:26 compute-0 nova_compute[187787]: 2025-12-08 20:26:26.940 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:27 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:26:27.661 105024 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=7a8539fb-8779-42f7-8fa8-222db61ea5ae, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 08 20:26:29 compute-0 podman[202017]: time="2025-12-08T20:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:26:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:26:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3472 "" "Go-http-client/1.1"
Dec 08 20:26:29 compute-0 nova_compute[187787]: 2025-12-08 20:26:29.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:29 compute-0 nova_compute[187787]: 2025-12-08 20:26:29.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:29 compute-0 nova_compute[187787]: 2025-12-08 20:26:29.817 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:26:29 compute-0 nova_compute[187787]: 2025-12-08 20:26:29.817 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:26:29 compute-0 nova_compute[187787]: 2025-12-08 20:26:29.818 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:26:29 compute-0 nova_compute[187787]: 2025-12-08 20:26:29.818 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.000 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.001 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5638MB free_disk=72.8762321472168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.001 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.002 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.092 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.093 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.120 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.125 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.138 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.140 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:26:30 compute-0 nova_compute[187787]: 2025-12-08 20:26:30.141 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:26:31 compute-0 openstack_network_exporter[204149]: ERROR   20:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:26:31 compute-0 openstack_network_exporter[204149]: ERROR   20:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:26:31 compute-0 openstack_network_exporter[204149]: ERROR   20:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:26:31 compute-0 openstack_network_exporter[204149]: ERROR   20:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:26:31 compute-0 openstack_network_exporter[204149]: ERROR   20:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:26:31 compute-0 nova_compute[187787]: 2025-12-08 20:26:31.942 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:32 compute-0 nova_compute[187787]: 2025-12-08 20:26:32.142 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:32 compute-0 nova_compute[187787]: 2025-12-08 20:26:32.142 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:26:32 compute-0 nova_compute[187787]: 2025-12-08 20:26:32.143 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:26:32 compute-0 nova_compute[187787]: 2025-12-08 20:26:32.173 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:26:32 compute-0 nova_compute[187787]: 2025-12-08 20:26:32.174 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:32 compute-0 nova_compute[187787]: 2025-12-08 20:26:32.174 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:32 compute-0 nova_compute[187787]: 2025-12-08 20:26:32.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:32 compute-0 nova_compute[187787]: 2025-12-08 20:26:32.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:33 compute-0 nova_compute[187787]: 2025-12-08 20:26:33.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:35 compute-0 nova_compute[187787]: 2025-12-08 20:26:35.127 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:35 compute-0 podman[223525]: 2025-12-08 20:26:35.516557686 +0000 UTC m=+0.077646889 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:26:35 compute-0 podman[223527]: 2025-12-08 20:26:35.526181698 +0000 UTC m=+0.086637062 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd)
Dec 08 20:26:35 compute-0 podman[223526]: 2025-12-08 20:26:35.566211155 +0000 UTC m=+0.119059119 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 08 20:26:35 compute-0 nova_compute[187787]: 2025-12-08 20:26:35.775 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:26:36 compute-0 nova_compute[187787]: 2025-12-08 20:26:36.944 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:40 compute-0 nova_compute[187787]: 2025-12-08 20:26:40.131 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:41 compute-0 nova_compute[187787]: 2025-12-08 20:26:41.945 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:45 compute-0 nova_compute[187787]: 2025-12-08 20:26:45.136 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:46 compute-0 nova_compute[187787]: 2025-12-08 20:26:46.948 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:49 compute-0 podman[223591]: 2025-12-08 20:26:49.513759297 +0000 UTC m=+0.084078261 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 08 20:26:50 compute-0 nova_compute[187787]: 2025-12-08 20:26:50.166 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:51 compute-0 nova_compute[187787]: 2025-12-08 20:26:51.949 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:54 compute-0 podman[223612]: 2025-12-08 20:26:54.542721073 +0000 UTC m=+0.099474966 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 08 20:26:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:26:55.004 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:26:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:26:55.005 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:26:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:26:55.005 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:26:55 compute-0 nova_compute[187787]: 2025-12-08 20:26:55.211 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:56 compute-0 nova_compute[187787]: 2025-12-08 20:26:56.954 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:26:57 compute-0 podman[223634]: 2025-12-08 20:26:57.50491021 +0000 UTC m=+0.072166947 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:26:57 compute-0 podman[223635]: 2025-12-08 20:26:57.511996992 +0000 UTC m=+0.080728376 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 08 20:26:59 compute-0 podman[202017]: time="2025-12-08T20:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:26:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:26:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3469 "" "Go-http-client/1.1"
Dec 08 20:27:00 compute-0 nova_compute[187787]: 2025-12-08 20:27:00.217 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:01 compute-0 openstack_network_exporter[204149]: ERROR   20:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:27:01 compute-0 openstack_network_exporter[204149]: ERROR   20:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:27:01 compute-0 openstack_network_exporter[204149]: ERROR   20:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:27:01 compute-0 openstack_network_exporter[204149]: ERROR   20:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:27:01 compute-0 openstack_network_exporter[204149]: ERROR   20:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:27:01 compute-0 nova_compute[187787]: 2025-12-08 20:27:01.955 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:05 compute-0 nova_compute[187787]: 2025-12-08 20:27:05.220 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:06 compute-0 podman[223677]: 2025-12-08 20:27:06.507223423 +0000 UTC m=+0.069255045 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:27:06 compute-0 podman[223679]: 2025-12-08 20:27:06.511044764 +0000 UTC m=+0.067222132 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:27:06 compute-0 podman[223678]: 2025-12-08 20:27:06.543517663 +0000 UTC m=+0.105767902 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:27:06 compute-0 nova_compute[187787]: 2025-12-08 20:27:06.960 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:09 compute-0 sshd-session[223633]: error: kex_exchange_identification: read: Connection reset by peer
Dec 08 20:27:09 compute-0 sshd-session[223633]: Connection reset by 45.78.217.210 port 33524
Dec 08 20:27:10 compute-0 nova_compute[187787]: 2025-12-08 20:27:10.223 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:11 compute-0 nova_compute[187787]: 2025-12-08 20:27:11.961 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:13 compute-0 sshd-session[223747]: Invalid user amir from 103.172.28.62 port 39052
Dec 08 20:27:13 compute-0 sshd-session[223747]: Received disconnect from 103.172.28.62 port 39052:11: Bye Bye [preauth]
Dec 08 20:27:13 compute-0 sshd-session[223747]: Disconnected from invalid user amir 103.172.28.62 port 39052 [preauth]
Dec 08 20:27:15 compute-0 nova_compute[187787]: 2025-12-08 20:27:15.228 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:16 compute-0 nova_compute[187787]: 2025-12-08 20:27:16.963 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.838 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.839 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.840 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.842 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.851 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.852 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.852 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.853 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2f10e420>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.853 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.854 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.854 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.854 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.854 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.857 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.857 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.857 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.857 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.857 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.858 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.858 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.858 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.859 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:27:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:27:20 compute-0 nova_compute[187787]: 2025-12-08 20:27:20.232 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:20 compute-0 podman[223752]: 2025-12-08 20:27:20.524842406 +0000 UTC m=+0.087018833 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Dec 08 20:27:21 compute-0 nova_compute[187787]: 2025-12-08 20:27:21.965 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:25 compute-0 nova_compute[187787]: 2025-12-08 20:27:25.236 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:25 compute-0 podman[223773]: 2025-12-08 20:27:25.532837663 +0000 UTC m=+0.094075055 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64)
Dec 08 20:27:26 compute-0 nova_compute[187787]: 2025-12-08 20:27:26.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:27:26 compute-0 nova_compute[187787]: 2025-12-08 20:27:26.781 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:27:26 compute-0 nova_compute[187787]: 2025-12-08 20:27:26.967 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:28 compute-0 podman[223796]: 2025-12-08 20:27:28.494782432 +0000 UTC m=+0.061393849 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:27:28 compute-0 podman[223797]: 2025-12-08 20:27:28.509579657 +0000 UTC m=+0.072126086 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 08 20:27:29 compute-0 podman[202017]: time="2025-12-08T20:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:27:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:27:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Dec 08 20:27:29 compute-0 nova_compute[187787]: 2025-12-08 20:27:29.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:27:29 compute-0 nova_compute[187787]: 2025-12-08 20:27:29.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:27:29 compute-0 nova_compute[187787]: 2025-12-08 20:27:29.828 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:27:29 compute-0 nova_compute[187787]: 2025-12-08 20:27:29.829 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:27:29 compute-0 nova_compute[187787]: 2025-12-08 20:27:29.829 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:27:29 compute-0 nova_compute[187787]: 2025-12-08 20:27:29.830 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.059 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.061 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5632MB free_disk=72.8762321472168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.062 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.062 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.155 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.156 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.185 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.237 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.240 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.241 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:27:30 compute-0 nova_compute[187787]: 2025-12-08 20:27:30.241 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:31 compute-0 openstack_network_exporter[204149]: ERROR   20:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:27:31 compute-0 openstack_network_exporter[204149]: ERROR   20:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:27:31 compute-0 openstack_network_exporter[204149]: ERROR   20:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:27:31 compute-0 openstack_network_exporter[204149]: ERROR   20:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:27:31 compute-0 openstack_network_exporter[204149]: ERROR   20:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:27:31 compute-0 nova_compute[187787]: 2025-12-08 20:27:31.969 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:32 compute-0 nova_compute[187787]: 2025-12-08 20:27:32.242 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:27:32 compute-0 nova_compute[187787]: 2025-12-08 20:27:32.243 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:27:32 compute-0 nova_compute[187787]: 2025-12-08 20:27:32.243 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:27:32 compute-0 nova_compute[187787]: 2025-12-08 20:27:32.263 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:27:32 compute-0 nova_compute[187787]: 2025-12-08 20:27:32.264 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:27:32 compute-0 nova_compute[187787]: 2025-12-08 20:27:32.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:27:32 compute-0 nova_compute[187787]: 2025-12-08 20:27:32.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:27:33 compute-0 nova_compute[187787]: 2025-12-08 20:27:33.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:27:34 compute-0 sshd-session[223838]: Invalid user kitchen from 200.155.38.219 port 24516
Dec 08 20:27:34 compute-0 sshd-session[223838]: Received disconnect from 200.155.38.219 port 24516:11: Bye Bye [preauth]
Dec 08 20:27:34 compute-0 sshd-session[223838]: Disconnected from invalid user kitchen 200.155.38.219 port 24516 [preauth]
Dec 08 20:27:35 compute-0 nova_compute[187787]: 2025-12-08 20:27:35.246 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:35 compute-0 sshd-session[223840]: Received disconnect from 47.76.127.165 port 39514:11: Bye Bye [preauth]
Dec 08 20:27:35 compute-0 sshd-session[223840]: Disconnected from authenticating user root 47.76.127.165 port 39514 [preauth]
Dec 08 20:27:35 compute-0 nova_compute[187787]: 2025-12-08 20:27:35.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:27:36 compute-0 sshd-session[223373]: ssh_dispatch_run_fatal: Connection from 45.78.228.32 port 53588: Connection timed out [preauth]
Dec 08 20:27:36 compute-0 nova_compute[187787]: 2025-12-08 20:27:36.972 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:37 compute-0 podman[223842]: 2025-12-08 20:27:37.500452983 +0000 UTC m=+0.067738028 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:27:37 compute-0 podman[223846]: 2025-12-08 20:27:37.538035203 +0000 UTC m=+0.084374160 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd)
Dec 08 20:27:37 compute-0 podman[223843]: 2025-12-08 20:27:37.568677615 +0000 UTC m=+0.120284068 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 08 20:27:40 compute-0 nova_compute[187787]: 2025-12-08 20:27:40.248 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:41 compute-0 nova_compute[187787]: 2025-12-08 20:27:41.974 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:45 compute-0 nova_compute[187787]: 2025-12-08 20:27:45.295 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:46 compute-0 nova_compute[187787]: 2025-12-08 20:27:46.976 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:50 compute-0 nova_compute[187787]: 2025-12-08 20:27:50.298 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:51 compute-0 podman[223915]: 2025-12-08 20:27:51.535755188 +0000 UTC m=+0.106311207 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 08 20:27:51 compute-0 sshd-session[223913]: Received disconnect from 193.46.255.7 port 50636:11:  [preauth]
Dec 08 20:27:51 compute-0 sshd-session[223913]: Disconnected from authenticating user root 193.46.255.7 port 50636 [preauth]
Dec 08 20:27:51 compute-0 nova_compute[187787]: 2025-12-08 20:27:51.979 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:27:55.005 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:27:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:27:55.006 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:27:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:27:55.006 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:27:55 compute-0 nova_compute[187787]: 2025-12-08 20:27:55.300 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:56 compute-0 podman[223935]: 2025-12-08 20:27:56.481183751 +0000 UTC m=+0.055662874 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, architecture=x86_64, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41)
Dec 08 20:27:56 compute-0 nova_compute[187787]: 2025-12-08 20:27:56.980 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:27:59 compute-0 podman[223956]: 2025-12-08 20:27:59.491762203 +0000 UTC m=+0.060008588 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:27:59 compute-0 podman[223957]: 2025-12-08 20:27:59.543548423 +0000 UTC m=+0.096925483 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 08 20:27:59 compute-0 podman[202017]: time="2025-12-08T20:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:27:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:27:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3471 "" "Go-http-client/1.1"
Dec 08 20:28:00 compute-0 nova_compute[187787]: 2025-12-08 20:28:00.303 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:01 compute-0 openstack_network_exporter[204149]: ERROR   20:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:28:01 compute-0 openstack_network_exporter[204149]: ERROR   20:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:28:01 compute-0 openstack_network_exporter[204149]: ERROR   20:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:28:01 compute-0 openstack_network_exporter[204149]: ERROR   20:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:28:01 compute-0 openstack_network_exporter[204149]: ERROR   20:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:28:01 compute-0 nova_compute[187787]: 2025-12-08 20:28:01.981 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:05 compute-0 nova_compute[187787]: 2025-12-08 20:28:05.306 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:06 compute-0 nova_compute[187787]: 2025-12-08 20:28:06.984 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:07 compute-0 sshd-session[223999]: Received disconnect from 101.47.160.247 port 41740:11: Bye Bye [preauth]
Dec 08 20:28:07 compute-0 sshd-session[223999]: Disconnected from 101.47.160.247 port 41740 [preauth]
Dec 08 20:28:08 compute-0 podman[224001]: 2025-12-08 20:28:08.509236974 +0000 UTC m=+0.070832696 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 08 20:28:08 compute-0 podman[224003]: 2025-12-08 20:28:08.534131932 +0000 UTC m=+0.091828564 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 08 20:28:08 compute-0 podman[224002]: 2025-12-08 20:28:08.572635848 +0000 UTC m=+0.128391379 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 08 20:28:10 compute-0 nova_compute[187787]: 2025-12-08 20:28:10.308 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:11 compute-0 nova_compute[187787]: 2025-12-08 20:28:11.985 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:15 compute-0 nova_compute[187787]: 2025-12-08 20:28:15.313 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:16 compute-0 nova_compute[187787]: 2025-12-08 20:28:16.988 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:20 compute-0 nova_compute[187787]: 2025-12-08 20:28:20.316 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:21 compute-0 sshd-session[224069]: Connection closed by 45.78.228.32 port 53664 [preauth]
Dec 08 20:28:21 compute-0 nova_compute[187787]: 2025-12-08 20:28:21.990 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:22 compute-0 podman[224072]: 2025-12-08 20:28:22.510779625 +0000 UTC m=+0.078164656 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:28:25 compute-0 nova_compute[187787]: 2025-12-08 20:28:25.320 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:26 compute-0 nova_compute[187787]: 2025-12-08 20:28:26.990 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:27 compute-0 podman[224092]: 2025-12-08 20:28:27.530732068 +0000 UTC m=+0.102809868 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41, vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm)
Dec 08 20:28:27 compute-0 nova_compute[187787]: 2025-12-08 20:28:27.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:27 compute-0 nova_compute[187787]: 2025-12-08 20:28:27.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:28:29 compute-0 podman[202017]: time="2025-12-08T20:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:28:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:28:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Dec 08 20:28:29 compute-0 nova_compute[187787]: 2025-12-08 20:28:29.775 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:30 compute-0 nova_compute[187787]: 2025-12-08 20:28:30.324 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:30 compute-0 podman[224115]: 2025-12-08 20:28:30.510918049 +0000 UTC m=+0.069778874 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 08 20:28:30 compute-0 podman[224114]: 2025-12-08 20:28:30.524357629 +0000 UTC m=+0.082938745 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:28:31 compute-0 openstack_network_exporter[204149]: ERROR   20:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:28:31 compute-0 openstack_network_exporter[204149]: ERROR   20:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:28:31 compute-0 openstack_network_exporter[204149]: ERROR   20:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:28:31 compute-0 openstack_network_exporter[204149]: ERROR   20:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:28:31 compute-0 openstack_network_exporter[204149]: ERROR   20:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:28:31 compute-0 nova_compute[187787]: 2025-12-08 20:28:31.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:31 compute-0 nova_compute[187787]: 2025-12-08 20:28:31.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:31 compute-0 nova_compute[187787]: 2025-12-08 20:28:31.817 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:28:31 compute-0 nova_compute[187787]: 2025-12-08 20:28:31.818 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:28:31 compute-0 nova_compute[187787]: 2025-12-08 20:28:31.818 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:28:31 compute-0 nova_compute[187787]: 2025-12-08 20:28:31.819 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:28:31 compute-0 nova_compute[187787]: 2025-12-08 20:28:31.992 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.030 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.031 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5656MB free_disk=72.8762321472168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.032 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.032 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.203 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.204 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.301 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.319 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.322 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:28:32 compute-0 nova_compute[187787]: 2025-12-08 20:28:32.322 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:28:33 compute-0 nova_compute[187787]: 2025-12-08 20:28:33.321 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:33 compute-0 nova_compute[187787]: 2025-12-08 20:28:33.322 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:28:33 compute-0 nova_compute[187787]: 2025-12-08 20:28:33.322 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:28:33 compute-0 nova_compute[187787]: 2025-12-08 20:28:33.374 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:28:33 compute-0 nova_compute[187787]: 2025-12-08 20:28:33.374 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:33 compute-0 nova_compute[187787]: 2025-12-08 20:28:33.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:34 compute-0 nova_compute[187787]: 2025-12-08 20:28:34.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:35 compute-0 nova_compute[187787]: 2025-12-08 20:28:35.327 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:35 compute-0 nova_compute[187787]: 2025-12-08 20:28:35.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:36 compute-0 nova_compute[187787]: 2025-12-08 20:28:36.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:36 compute-0 nova_compute[187787]: 2025-12-08 20:28:36.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 08 20:28:36 compute-0 nova_compute[187787]: 2025-12-08 20:28:36.994 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:38 compute-0 nova_compute[187787]: 2025-12-08 20:28:38.790 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:39 compute-0 podman[224156]: 2025-12-08 20:28:39.507683403 +0000 UTC m=+0.077473845 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:28:39 compute-0 podman[224157]: 2025-12-08 20:28:39.520898786 +0000 UTC m=+0.088763297 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Dec 08 20:28:39 compute-0 podman[224158]: 2025-12-08 20:28:39.530011972 +0000 UTC m=+0.084354020 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd)
Dec 08 20:28:40 compute-0 nova_compute[187787]: 2025-12-08 20:28:40.329 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:41 compute-0 nova_compute[187787]: 2025-12-08 20:28:41.997 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:42 compute-0 nova_compute[187787]: 2025-12-08 20:28:42.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:42 compute-0 nova_compute[187787]: 2025-12-08 20:28:42.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 08 20:28:42 compute-0 nova_compute[187787]: 2025-12-08 20:28:42.801 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 08 20:28:45 compute-0 nova_compute[187787]: 2025-12-08 20:28:45.333 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:45 compute-0 sshd-session[224221]: Received disconnect from 103.172.28.62 port 48696:11: Bye Bye [preauth]
Dec 08 20:28:45 compute-0 sshd-session[224221]: Disconnected from authenticating user root 103.172.28.62 port 48696 [preauth]
Dec 08 20:28:46 compute-0 nova_compute[187787]: 2025-12-08 20:28:46.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:47 compute-0 nova_compute[187787]: 2025-12-08 20:28:47.000 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:50 compute-0 nova_compute[187787]: 2025-12-08 20:28:50.335 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:52 compute-0 nova_compute[187787]: 2025-12-08 20:28:52.001 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:53 compute-0 podman[224224]: 2025-12-08 20:28:53.514997936 +0000 UTC m=+0.085349121 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 08 20:28:54 compute-0 nova_compute[187787]: 2025-12-08 20:28:54.103 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:28:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:28:55.006 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:28:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:28:55.007 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:28:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:28:55.007 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:28:55 compute-0 nova_compute[187787]: 2025-12-08 20:28:55.375 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:57 compute-0 nova_compute[187787]: 2025-12-08 20:28:57.003 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:28:58 compute-0 podman[224244]: 2025-12-08 20:28:58.483613523 +0000 UTC m=+0.061134703 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Dec 08 20:28:59 compute-0 podman[202017]: time="2025-12-08T20:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:28:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:28:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Dec 08 20:29:00 compute-0 nova_compute[187787]: 2025-12-08 20:29:00.378 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:01 compute-0 openstack_network_exporter[204149]: ERROR   20:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:29:01 compute-0 openstack_network_exporter[204149]: ERROR   20:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:29:01 compute-0 openstack_network_exporter[204149]: ERROR   20:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:29:01 compute-0 openstack_network_exporter[204149]: ERROR   20:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:29:01 compute-0 openstack_network_exporter[204149]: ERROR   20:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:29:01 compute-0 podman[224265]: 2025-12-08 20:29:01.514509241 +0000 UTC m=+0.071271521 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 08 20:29:01 compute-0 podman[224266]: 2025-12-08 20:29:01.538793581 +0000 UTC m=+0.092175895 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:29:02 compute-0 nova_compute[187787]: 2025-12-08 20:29:02.006 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:05 compute-0 nova_compute[187787]: 2025-12-08 20:29:05.382 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:05 compute-0 sshd-session[224307]: Received disconnect from 47.76.127.165 port 35788:11: Bye Bye [preauth]
Dec 08 20:29:05 compute-0 sshd-session[224307]: Disconnected from authenticating user root 47.76.127.165 port 35788 [preauth]
Dec 08 20:29:07 compute-0 nova_compute[187787]: 2025-12-08 20:29:07.008 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:10 compute-0 nova_compute[187787]: 2025-12-08 20:29:10.384 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:10 compute-0 podman[224309]: 2025-12-08 20:29:10.513161555 +0000 UTC m=+0.077971881 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 08 20:29:10 compute-0 podman[224311]: 2025-12-08 20:29:10.517731457 +0000 UTC m=+0.081839301 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 08 20:29:10 compute-0 podman[224310]: 2025-12-08 20:29:10.545732553 +0000 UTC m=+0.109011211 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 08 20:29:12 compute-0 nova_compute[187787]: 2025-12-08 20:29:12.009 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:15 compute-0 nova_compute[187787]: 2025-12-08 20:29:15.431 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:17 compute-0 nova_compute[187787]: 2025-12-08 20:29:17.011 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.838 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads to execute them. Therefore, one can expect the process to be longer than the expected. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.839 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.839 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.840 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.840 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.841 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.843 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.843 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.845 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.846 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.848 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.851 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.851 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.852 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.852 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.852 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.853 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.853 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.853 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.854 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.854 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.854 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': [], 'disk.device.allocation': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.854 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.857 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.857 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.857 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.857 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.857 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.857 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.857 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.858 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.858 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.858 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.858 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.858 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.858 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.858 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.859 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.859 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.859 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.859 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.859 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.859 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.860 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:29:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:29:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
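[editor's illustrative note] The ceilometer entries above trace one polling cycle: the manager notes that the [pollsters] source has more pollsters than worker threads, processes them with a single-thread executor, runs local_instances discovery per pollster, skips every meter because discovery returned no resources, and finally reports each pollster as processed. As a rough illustration of that dispatch pattern (a generic sketch under assumed names, not ceilometer's actual code), the submit-then-skip flow looks roughly like this:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for a few of the pollsters seen in the log above.
POLLSTERS = ["disk.device.read.latency", "disk.device.usage", "power.state"]

def discover_local_instances():
    # On this host discovery found no instances, so every pollster is skipped.
    return []

def run_pollster(name, resources):
    if not resources:
        print(f"Skip pollster {name}, no resources found this cycle")
        return
    print(f"Polling {name} for {len(resources)} resources")

# One worker thread, matching "Processing pollsters for [pollsters] with [1] threads."
with ThreadPoolExecutor(max_workers=1) as executor:
    resources = discover_local_instances()
    futures = [executor.submit(run_pollster, name, resources) for name in POLLSTERS]
    for name, fut in zip(POLLSTERS, futures):
        fut.result()
        print(f"Finished processing pollster [{name}].")
```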
Dec 08 20:29:20 compute-0 sshd-session[224374]: Received disconnect from 200.155.38.219 port 25391:11: Bye Bye [preauth]
Dec 08 20:29:20 compute-0 sshd-session[224374]: Disconnected from authenticating user root 200.155.38.219 port 25391 [preauth]
Dec 08 20:29:20 compute-0 nova_compute[187787]: 2025-12-08 20:29:20.433 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:22 compute-0 nova_compute[187787]: 2025-12-08 20:29:22.013 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:24 compute-0 podman[224377]: 2025-12-08 20:29:24.500811012 +0000 UTC m=+0.070527007 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Dec 08 20:29:25 compute-0 nova_compute[187787]: 2025-12-08 20:29:25.437 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:27 compute-0 nova_compute[187787]: 2025-12-08 20:29:27.016 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:27 compute-0 nova_compute[187787]: 2025-12-08 20:29:27.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:29:27 compute-0 nova_compute[187787]: 2025-12-08 20:29:27.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:29:29 compute-0 sshd-session[224398]: Received disconnect from 45.78.217.210 port 37892:11: Bye Bye [preauth]
Dec 08 20:29:29 compute-0 sshd-session[224398]: Disconnected from authenticating user root 45.78.217.210 port 37892 [preauth]
Dec 08 20:29:29 compute-0 podman[224400]: 2025-12-08 20:29:29.518578047 +0000 UTC m=+0.082249764 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Dec 08 20:29:29 compute-0 podman[202017]: time="2025-12-08T20:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:29:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:29:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3485 "" "Go-http-client/1.1"
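[editor's illustrative note] The two libpod API lines above show the podman service answering GET /v4.9.3/libpod/containers/json and .../containers/stats over its local socket. For illustration only, a minimal Python sketch that issues the same containers/json request over an AF_UNIX socket; the socket path below is an assumption, so substitute whatever path the service on this host actually listens on:

```python
import http.client
import json
import socket

class UnixHTTPConnection(http.client.HTTPConnection):
    """HTTPConnection variant that talks to a local AF_UNIX socket."""
    def __init__(self, path):
        super().__init__("localhost")
        self.unix_path = path

    def connect(self):
        sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        sock.connect(self.unix_path)
        self.sock = sock

# Assumed socket path for the podman service; adjust as needed.
conn = UnixHTTPConnection("/run/podman/podman.sock")
conn.request("GET", "/v4.9.3/libpod/containers/json?all=true")
resp = conn.getresponse()
for c in json.loads(resp.read()):
    print(c.get("Names"), c.get("State"))
```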
Dec 08 20:29:30 compute-0 nova_compute[187787]: 2025-12-08 20:29:30.442 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:31 compute-0 openstack_network_exporter[204149]: ERROR   20:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:29:31 compute-0 openstack_network_exporter[204149]: ERROR   20:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:29:31 compute-0 openstack_network_exporter[204149]: ERROR   20:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:29:31 compute-0 openstack_network_exporter[204149]: ERROR   20:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:29:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:29:31 compute-0 openstack_network_exporter[204149]: ERROR   20:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:29:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:29:31 compute-0 nova_compute[187787]: 2025-12-08 20:29:31.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:29:31 compute-0 nova_compute[187787]: 2025-12-08 20:29:31.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.017 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:32 compute-0 podman[224422]: 2025-12-08 20:29:32.504029383 +0000 UTC m=+0.069997401 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:29:32 compute-0 podman[224423]: 2025-12-08 20:29:32.509139074 +0000 UTC m=+0.071138757 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.806 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.807 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.835 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.835 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.836 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:29:32 compute-0 nova_compute[187787]: 2025-12-08 20:29:32.836 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.044 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.048 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5640MB free_disk=72.8762321472168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.048 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.049 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.127 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.127 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.254 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.273 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.276 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:29:33 compute-0 nova_compute[187787]: 2025-12-08 20:29:33.276 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:29:35 compute-0 nova_compute[187787]: 2025-12-08 20:29:35.249 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:29:35 compute-0 nova_compute[187787]: 2025-12-08 20:29:35.445 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:35 compute-0 nova_compute[187787]: 2025-12-08 20:29:35.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:29:35 compute-0 nova_compute[187787]: 2025-12-08 20:29:35.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:29:37 compute-0 nova_compute[187787]: 2025-12-08 20:29:37.020 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:37 compute-0 nova_compute[187787]: 2025-12-08 20:29:37.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:29:40 compute-0 nova_compute[187787]: 2025-12-08 20:29:40.447 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:41 compute-0 podman[224465]: 2025-12-08 20:29:41.520413131 +0000 UTC m=+0.083462252 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:29:41 compute-0 podman[224467]: 2025-12-08 20:29:41.536816684 +0000 UTC m=+0.091250206 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 08 20:29:41 compute-0 podman[224466]: 2025-12-08 20:29:41.564177739 +0000 UTC m=+0.131458313 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 08 20:29:42 compute-0 nova_compute[187787]: 2025-12-08 20:29:42.022 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:45 compute-0 nova_compute[187787]: 2025-12-08 20:29:45.453 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:47 compute-0 nova_compute[187787]: 2025-12-08 20:29:47.024 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:50 compute-0 nova_compute[187787]: 2025-12-08 20:29:50.456 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:52 compute-0 nova_compute[187787]: 2025-12-08 20:29:52.028 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:29:55.007 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:29:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:29:55.008 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:29:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:29:55.008 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:29:55 compute-0 nova_compute[187787]: 2025-12-08 20:29:55.515 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:55 compute-0 podman[224533]: 2025-12-08 20:29:55.567786816 +0000 UTC m=+0.140268510 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=edpm)
Dec 08 20:29:57 compute-0 nova_compute[187787]: 2025-12-08 20:29:57.029 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:29:59 compute-0 podman[202017]: time="2025-12-08T20:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:29:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:29:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3477 "" "Go-http-client/1.1"
Dec 08 20:30:00 compute-0 podman[224554]: 2025-12-08 20:30:00.501295947 +0000 UTC m=+0.068459984 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 08 20:30:00 compute-0 nova_compute[187787]: 2025-12-08 20:30:00.518 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:01 compute-0 openstack_network_exporter[204149]: ERROR   20:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:30:01 compute-0 openstack_network_exporter[204149]: ERROR   20:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:30:01 compute-0 openstack_network_exporter[204149]: ERROR   20:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:30:01 compute-0 openstack_network_exporter[204149]: ERROR   20:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:30:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:30:01 compute-0 openstack_network_exporter[204149]: ERROR   20:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:30:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:30:02 compute-0 nova_compute[187787]: 2025-12-08 20:30:02.034 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:03 compute-0 podman[224575]: 2025-12-08 20:30:03.502851667 +0000 UTC m=+0.066594594 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:30:03 compute-0 podman[224576]: 2025-12-08 20:30:03.511927251 +0000 UTC m=+0.076653390 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 08 20:30:05 compute-0 nova_compute[187787]: 2025-12-08 20:30:05.523 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:07 compute-0 nova_compute[187787]: 2025-12-08 20:30:07.036 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:10 compute-0 nova_compute[187787]: 2025-12-08 20:30:10.528 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:12 compute-0 nova_compute[187787]: 2025-12-08 20:30:12.039 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:12 compute-0 podman[224619]: 2025-12-08 20:30:12.511998257 +0000 UTC m=+0.078041632 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 08 20:30:12 compute-0 podman[224621]: 2025-12-08 20:30:12.578930781 +0000 UTC m=+0.144966256 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:30:12 compute-0 podman[224620]: 2025-12-08 20:30:12.596632814 +0000 UTC m=+0.163466094 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 08 20:30:15 compute-0 nova_compute[187787]: 2025-12-08 20:30:15.532 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:17 compute-0 nova_compute[187787]: 2025-12-08 20:30:17.042 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:20 compute-0 nova_compute[187787]: 2025-12-08 20:30:20.541 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:22 compute-0 nova_compute[187787]: 2025-12-08 20:30:22.045 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:22 compute-0 sshd-session[224687]: Invalid user administrator from 103.172.28.62 port 48026
Dec 08 20:30:23 compute-0 sshd-session[224687]: Received disconnect from 103.172.28.62 port 48026:11: Bye Bye [preauth]
Dec 08 20:30:23 compute-0 sshd-session[224687]: Disconnected from invalid user administrator 103.172.28.62 port 48026 [preauth]
Dec 08 20:30:24 compute-0 sshd-session[224685]: Received disconnect from 101.47.160.247 port 59810:11: Bye Bye [preauth]
Dec 08 20:30:24 compute-0 sshd-session[224685]: Disconnected from authenticating user root 101.47.160.247 port 59810 [preauth]
Dec 08 20:30:25 compute-0 nova_compute[187787]: 2025-12-08 20:30:25.546 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:26 compute-0 podman[224689]: 2025-12-08 20:30:26.528700364 +0000 UTC m=+0.090391719 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42)
Dec 08 20:30:27 compute-0 nova_compute[187787]: 2025-12-08 20:30:27.046 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:27 compute-0 nova_compute[187787]: 2025-12-08 20:30:27.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:30:27 compute-0 nova_compute[187787]: 2025-12-08 20:30:27.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:30:29 compute-0 podman[202017]: time="2025-12-08T20:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:30:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:30:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3477 "" "Go-http-client/1.1"
Dec 08 20:30:30 compute-0 nova_compute[187787]: 2025-12-08 20:30:30.551 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:31 compute-0 openstack_network_exporter[204149]: ERROR   20:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:30:31 compute-0 openstack_network_exporter[204149]: ERROR   20:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:30:31 compute-0 openstack_network_exporter[204149]: ERROR   20:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:30:31 compute-0 openstack_network_exporter[204149]: ERROR   20:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:30:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:30:31 compute-0 openstack_network_exporter[204149]: ERROR   20:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:30:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:30:31 compute-0 podman[224710]: 2025-12-08 20:30:31.499319304 +0000 UTC m=+0.069006239 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1755695350, io.openshift.tags=minimal rhel9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 08 20:30:31 compute-0 nova_compute[187787]: 2025-12-08 20:30:31.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.048 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.811 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.811 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.812 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.812 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.985 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.986 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5656MB free_disk=72.87656021118164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.986 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:30:32 compute-0 nova_compute[187787]: 2025-12-08 20:30:32.987 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.049 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.050 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.073 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing inventories for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.094 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Updating ProviderTree inventory for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.094 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Updating inventory in ProviderTree for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.113 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing aggregate associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.156 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Refreshing trait associations for resource provider b3899b98-89be-4b90-bd85-9c57a93a16c4, traits: COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SSE42,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE41,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_STORAGE_BUS_IDE,COMPUTE_VOLUME_EXTEND,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSSE3,COMPUTE_ACCELERATORS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_RESCUE_BFV,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_BMI,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_AESNI,HW_CPU_X86_FMA3,COMPUTE_NODE,HW_CPU_X86_SSE2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SHA,HW_CPU_X86_F16C,HW_CPU_X86_AMD_SVM,HW_CPU_X86_AVX2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE,HW_CPU_X86_SSE4A,HW_CPU_X86_ABM _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.183 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.200 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.202 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:30:33 compute-0 nova_compute[187787]: 2025-12-08 20:30:33.202 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
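The inventory records above are what Placement uses to derive schedulable capacity: per resource class, capacity is (total - reserved) * allocation_ratio, with step_size and max_unit bounding individual allocations. A minimal sketch of that arithmetic, using the inventory dict copied from the log (the capacity() helper is illustrative, not Nova code):

    # Sketch: derive schedulable capacity from the Placement inventory above.
    # Per resource class: (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU':      {'total': 8,    'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 7680, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 79,   'reserved': 1,   'allocation_ratio': 0.9},
    }

    def capacity(inv):
        return {rc: int((v['total'] - v['reserved']) * v['allocation_ratio'])
                for rc, v in inv.items()}

    print(capacity(inventory))
    # {'VCPU': 32, 'MEMORY_MB': 7168, 'DISK_GB': 70}

So this node advertises 32 VCPU, 7168 MB of RAM and 70 GB of disk to the scheduler even though it only has 8 physical vCPUs, because of the 4.0 CPU overcommit ratio.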
Dec 08 20:30:34 compute-0 nova_compute[187787]: 2025-12-08 20:30:34.202 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:30:34 compute-0 nova_compute[187787]: 2025-12-08 20:30:34.203 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:30:34 compute-0 nova_compute[187787]: 2025-12-08 20:30:34.203 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:30:34 compute-0 nova_compute[187787]: 2025-12-08 20:30:34.224 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:30:34 compute-0 podman[224732]: 2025-12-08 20:30:34.502702093 +0000 UTC m=+0.068148713 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 08 20:30:34 compute-0 podman[224731]: 2025-12-08 20:30:34.508969428 +0000 UTC m=+0.071698603 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
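The node_exporter command line captured in the health event above restricts the systemd collector with --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service. A quick way to preview which unit names that pattern admits (pure-Python sketch; the sample unit names are illustrative, and fullmatch only approximates the anchored matching node_exporter applies):

    import re

    # Pattern copied from the node_exporter command line above (the doubled
    # backslash in the container config collapses to a single "\." here).
    unit_include = re.compile(r'(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service')

    samples = ['edpm_nova_compute.service', 'ovs-vswitchd.service',
               'virtqemud.service', 'rsyslog.service', 'sshd.service']
    for unit in samples:
        print(unit, '->', bool(unit_include.fullmatch(unit)))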
Dec 08 20:30:35 compute-0 nova_compute[187787]: 2025-12-08 20:30:35.554 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:35 compute-0 nova_compute[187787]: 2025-12-08 20:30:35.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:30:36 compute-0 sshd-session[224771]: Received disconnect from 47.76.127.165 port 51226:11: Bye Bye [preauth]
Dec 08 20:30:36 compute-0 sshd-session[224771]: Disconnected from authenticating user root 47.76.127.165 port 51226 [preauth]
Dec 08 20:30:36 compute-0 nova_compute[187787]: 2025-12-08 20:30:36.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:30:37 compute-0 nova_compute[187787]: 2025-12-08 20:30:37.051 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:37 compute-0 nova_compute[187787]: 2025-12-08 20:30:37.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:30:37 compute-0 nova_compute[187787]: 2025-12-08 20:30:37.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
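The recurring "Running periodic task ComputeManager._*" lines come from oslo.service's PeriodicTasks machinery: methods decorated with periodic_task are collected on a manager class and invoked by run_periodic_tasks at their configured spacing. A minimal sketch of that pattern (ManagerSketch and its task body are illustrative; only the oslo APIs are assumed):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class ManagerSketch(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(cfg.CONF)

        # run_immediately=True so the sketch fires on the first call instead of
        # waiting one spacing interval, as the real ComputeManager tasks do.
        @periodic_task.periodic_task(spacing=60, run_immediately=True)
        def _poll_volume_usage(self, context):
            print('polling volume usage')

    mgr = ManagerSketch()
    mgr.run_periodic_tasks(context=None)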
Dec 08 20:30:40 compute-0 nova_compute[187787]: 2025-12-08 20:30:40.558 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:40 compute-0 nova_compute[187787]: 2025-12-08 20:30:40.776 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:30:42 compute-0 nova_compute[187787]: 2025-12-08 20:30:42.053 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:43 compute-0 podman[224775]: 2025-12-08 20:30:43.501332824 +0000 UTC m=+0.065271422 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 08 20:30:43 compute-0 podman[224773]: 2025-12-08 20:30:43.510226253 +0000 UTC m=+0.071129986 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 08 20:30:43 compute-0 podman[224774]: 2025-12-08 20:30:43.562915831 +0000 UTC m=+0.120342576 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
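Each health_status=healthy event above is podman executing the container's configured healthcheck command (here '/openstack/healthcheck') and recording the result. The same check can be triggered by hand; a small sketch wrapping the podman CLI, with container names taken from the events above:

    import subprocess

    # 'podman healthcheck run NAME' executes the configured healthcheck inside
    # the container and exits 0 when it passes.
    for name in ('ovn_controller', 'ovn_metadata_agent', 'multipathd'):
        rc = subprocess.run(['podman', 'healthcheck', 'run', name]).returncode
        print(name, 'healthy' if rc == 0 else f'unhealthy (rc={rc})')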
Dec 08 20:30:45 compute-0 nova_compute[187787]: 2025-12-08 20:30:45.561 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:47 compute-0 nova_compute[187787]: 2025-12-08 20:30:47.055 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:50 compute-0 nova_compute[187787]: 2025-12-08 20:30:50.565 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:52 compute-0 nova_compute[187787]: 2025-12-08 20:30:52.058 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:30:55.009 105024 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:30:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:30:55.010 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:30:55 compute-0 ovn_metadata_agent[105019]: 2025-12-08 20:30:55.010 105024 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:30:55 compute-0 nova_compute[187787]: 2025-12-08 20:30:55.569 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:57 compute-0 nova_compute[187787]: 2025-12-08 20:30:57.061 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:30:57 compute-0 podman[224843]: 2025-12-08 20:30:57.495134115 +0000 UTC m=+0.071983312 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.4, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, tcib_managed=true, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 08 20:30:59 compute-0 podman[202017]: time="2025-12-08T20:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:30:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:30:59 compute-0 podman[202017]: @ - - [08/Dec/2025:20:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3476 "" "Go-http-client/1.1"
Dec 08 20:31:00 compute-0 nova_compute[187787]: 2025-12-08 20:31:00.573 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:01 compute-0 openstack_network_exporter[204149]: ERROR   20:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:31:01 compute-0 openstack_network_exporter[204149]: ERROR   20:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:31:01 compute-0 openstack_network_exporter[204149]: ERROR   20:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:31:01 compute-0 openstack_network_exporter[204149]: ERROR   20:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:31:01 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:31:01 compute-0 openstack_network_exporter[204149]: ERROR   20:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:31:01 compute-0 openstack_network_exporter[204149]: 
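The exporter errors above mean openstack_network_exporter could not find the *.ctl control sockets that appctl-style tools use to reach ovn-northd and ovsdb-server; that usually means the daemon is not running on this host (ovn-northd lives on the control plane, not on compute nodes) or its runtime directory is not visible inside the exporter container, so on a compute node these messages are noise rather than a fault. A quick check of which control sockets actually exist, using the standard runtime directories that match the exporter's /run/openvswitch and /run/ovn mounts:

    import glob

    # appctl-style tools locate a daemon through its <name>.<pid>.ctl socket.
    for pattern in ('/run/openvswitch/*.ctl', '/run/ovn/*.ctl'):
        found = glob.glob(pattern)
        print(pattern, '->', found if found else 'no control sockets')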
Dec 08 20:31:02 compute-0 nova_compute[187787]: 2025-12-08 20:31:02.062 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:02 compute-0 podman[224865]: 2025-12-08 20:31:02.529739997 +0000 UTC m=+0.087415825 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Dec 08 20:31:05 compute-0 podman[224887]: 2025-12-08 20:31:05.487657762 +0000 UTC m=+0.055305261 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 08 20:31:05 compute-0 podman[224888]: 2025-12-08 20:31:05.518192877 +0000 UTC m=+0.072911052 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 08 20:31:05 compute-0 nova_compute[187787]: 2025-12-08 20:31:05.577 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:07 compute-0 nova_compute[187787]: 2025-12-08 20:31:07.065 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:08 compute-0 sshd-session[224931]: Invalid user sam from 200.155.38.219 port 10035
Dec 08 20:31:08 compute-0 sshd-session[224931]: Received disconnect from 200.155.38.219 port 10035:11: Bye Bye [preauth]
Dec 08 20:31:08 compute-0 sshd-session[224931]: Disconnected from invalid user sam 200.155.38.219 port 10035 [preauth]
Dec 08 20:31:10 compute-0 nova_compute[187787]: 2025-12-08 20:31:10.582 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:12 compute-0 nova_compute[187787]: 2025-12-08 20:31:12.069 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:14 compute-0 podman[224933]: 2025-12-08 20:31:14.507164246 +0000 UTC m=+0.072044284 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 08 20:31:14 compute-0 podman[224935]: 2025-12-08 20:31:14.525918593 +0000 UTC m=+0.080869531 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 08 20:31:14 compute-0 podman[224934]: 2025-12-08 20:31:14.539121006 +0000 UTC m=+0.107937758 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202)
Dec 08 20:31:15 compute-0 nova_compute[187787]: 2025-12-08 20:31:15.631 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:17 compute-0 nova_compute[187787]: 2025-12-08 20:31:17.070 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.839 14 DEBUG ceilometer.polling.manager [-] The number of pollsters in source [pollsters] is bigger than the number of worker threads available to execute them; therefore, processing can be expected to take longer than usual. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:253
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.840 14 DEBUG ceilometer.polling.manager [-] Processing pollsters for [pollsters] with [1] threads. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:262
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.841 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3050>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.842 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskReadLatencyPollster object at 0x7f4d2b0a3020>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.842 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a30e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b130110>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3140>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{}], and discovery cache [{}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.843 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a31a0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.844 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.844 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a39e0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.844 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDevicePhysicalPollster object at 0x7f4d2b0a30b0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.845 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3200>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.845 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.846 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.PowerStatePollster object at 0x7f4d2b1300e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.846 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3260>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.847 14 DEBUG ceilometer.polling.manager [-] Skip pollster power.state, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.847 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3a70>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.847 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteBytesPollster object at 0x7f4d2b0a3110>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a32c0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.848 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.848 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2df922d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.849 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceDiskWriteLatencyPollster object at 0x7f4d2b0a3170>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.849 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3ad0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.849 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2d8182f0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.850 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesRatePollster object at 0x7f4d2b0a3560>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.850 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3b60>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.850 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.851 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2c2a9be0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.851 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceWriteRequestsPollster object at 0x7f4d2b0a31d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.851 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3bf0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.851 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.852 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3c80>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.852 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingPacketsPollster object at 0x7f4d2b0a3800>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.852 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a34d0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.853 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.853 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3d10>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.853 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.EphemeralSizePollster object at 0x7f4d2b0a3230>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.854 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3530>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.854 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.ephemeral.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.854 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2e5be540>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.855 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingDropPollster object at 0x7f4d2b0a3a40>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.855 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3da0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.855 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.856 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a3e30>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.856 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.RootSizePollster object at 0x7f4d2b0a3290>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.856 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2fc0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.856 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.root.size, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.857 14 DEBUG ceilometer.polling.manager [-] Registering pollster [<stevedore.extension.Extension object at 0x7f4d2b0a2ff0>] from source [pollsters] to be executed via executor [<concurrent.futures.thread.ThreadPoolExecutor object at 0x7f4d2ac100e0>] with cache [{}], pollster history [{'disk.device.read.latency': [], 'disk.device.usage': [], 'power.state': [], 'disk.device.write.bytes': [], 'disk.device.write.latency': [], 'network.incoming.bytes.rate': [], 'disk.device.write.requests': [], 'network.incoming.packets': [], 'disk.ephemeral.size': [], 'network.incoming.packets.drop': [], 'disk.root.size': []}], and discovery cache [{'local_instances': []}]. register_pollster_execution /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:276
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.857 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceAllocationPollster object at 0x7f4d2e9684d0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.858 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingErrorsPollster object at 0x7f4d2b0a3aa0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.858 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesDeltaPollster object at 0x7f4d2b0a3500>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.858 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesPollster object at 0x7f4d2b0a3b30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.858 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadRequestsPollster object at 0x7f4d2b0a3080>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesDeltaPollster object at 0x7f4d2b0a3bc0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingBytesRatePollster object at 0x7f4d2b0a3c50>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.MemoryUsagePollster object at 0x7f4d2b0a34a0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.859 14 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.859 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingPacketsPollster object at 0x7f4d2b0a3ce0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.860 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.860 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.IncomingBytesPollster object at 0x7f4d2c21da30>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.860 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.860 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.instance_stats.CPUPollster object at 0x7f4d2b0a15e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.860 14 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.860 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingDropPollster object at 0x7f4d2b0a3d70>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.860 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.860 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.net.OutgoingErrorsPollster object at 0x7f4d2b0a3e00>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.861 14 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.861 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceCapacityPollster object at 0x7f4d2c36e1e0>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.861 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.861 14 DEBUG ceilometer.polling.manager [-] Executing discovery process for pollsters [<ceilometer.compute.pollsters.disk.PerDeviceReadBytesPollster object at 0x7f4d2b0a2f90>] and discovery method [local_instances] via process [<bound method AgentManager.discover of <ceilometer.polling.manager.AgentManager object at 0x7f4d2c247950>>]. _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:294
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.861 14 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle _internal_pollster_run /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:321
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.861 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [power.state]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.latency]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.write.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.ephemeral.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.862 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.root.size]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.allocation]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.requests]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.delta]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.bytes.rate]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [memory.usage]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.incoming.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.863 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [cpu]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.864 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.drop]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.864 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [network.outgoing.packets.error]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.864 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.capacity]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:19 compute-0 ceilometer_agent_compute[198537]: 2025-12-08 20:31:19.864 14 DEBUG ceilometer.polling.manager [-] Finished processing pollster [disk.device.read.bytes]. execute_polling_task_processing /usr/lib/python3.12/site-packages/ceilometer/polling/manager.py:272
Dec 08 20:31:20 compute-0 nova_compute[187787]: 2025-12-08 20:31:20.637 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:22 compute-0 nova_compute[187787]: 2025-12-08 20:31:22.073 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:25 compute-0 nova_compute[187787]: 2025-12-08 20:31:25.640 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:27 compute-0 nova_compute[187787]: 2025-12-08 20:31:27.076 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:28 compute-0 podman[225005]: 2025-12-08 20:31:28.511555688 +0000 UTC m=+0.079758657 container health_status 7ac6d8f2fd67c1c60638d4d2c203e905dc7961008d85b860b2ba7d65ddc47964 (image=quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=3a7876c5b6a4ff2e2bc50e11e9db5f42, io.buildah.version=1.41.4, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-compute:current-tested', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 10 Base Image)
Dec 08 20:31:28 compute-0 nova_compute[187787]: 2025-12-08 20:31:28.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:31:28 compute-0 nova_compute[187787]: 2025-12-08 20:31:28.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 08 20:31:29 compute-0 podman[202017]: time="2025-12-08T20:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 08 20:31:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 22543 "" "Go-http-client/1.1"
Dec 08 20:31:29 compute-0 podman[202017]: @ - - [08/Dec/2025:20:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3482 "" "Go-http-client/1.1"
Dec 08 20:31:30 compute-0 nova_compute[187787]: 2025-12-08 20:31:30.645 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:31 compute-0 openstack_network_exporter[204149]: ERROR   20:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:31:31 compute-0 openstack_network_exporter[204149]: ERROR   20:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Dec 08 20:31:31 compute-0 openstack_network_exporter[204149]: ERROR   20:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Dec 08 20:31:31 compute-0 openstack_network_exporter[204149]: ERROR   20:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Dec 08 20:31:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:31:31 compute-0 openstack_network_exporter[204149]: ERROR   20:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Dec 08 20:31:31 compute-0 openstack_network_exporter[204149]: 
Dec 08 20:31:31 compute-0 nova_compute[187787]: 2025-12-08 20:31:31.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:31:32 compute-0 nova_compute[187787]: 2025-12-08 20:31:32.076 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:32 compute-0 sshd-session[225025]: Accepted publickey for zuul from 192.168.122.10 port 59888 ssh2: ECDSA SHA256:plADixKgMPl0DXcCt4KOPae0gEa5a51cLFSZBYisJLw
Dec 08 20:31:32 compute-0 systemd-logind[793]: New session 29 of user zuul.
Dec 08 20:31:32 compute-0 systemd[1]: Started Session 29 of User zuul.
Dec 08 20:31:32 compute-0 sshd-session[225025]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Dec 08 20:31:32 compute-0 podman[225027]: 2025-12-08 20:31:32.740310389 +0000 UTC m=+0.080164539 container health_status adf874435ddf42f4572b7fbafbf7bc6dc4a8adb3a9c661441af895e3fe359aa8 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_id=edpm, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Dec 08 20:31:32 compute-0 nova_compute[187787]: 2025-12-08 20:31:32.775 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:31:32 compute-0 sudo[225050]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Dec 08 20:31:32 compute-0 sudo[225050]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Dec 08 20:31:34 compute-0 nova_compute[187787]: 2025-12-08 20:31:34.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:31:34 compute-0 nova_compute[187787]: 2025-12-08 20:31:34.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 08 20:31:34 compute-0 nova_compute[187787]: 2025-12-08 20:31:34.780 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 08 20:31:34 compute-0 nova_compute[187787]: 2025-12-08 20:31:34.797 187791 DEBUG nova.compute.manager [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 08 20:31:34 compute-0 nova_compute[187787]: 2025-12-08 20:31:34.798 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:31:34 compute-0 nova_compute[187787]: 2025-12-08 20:31:34.851 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:31:34 compute-0 nova_compute[187787]: 2025-12-08 20:31:34.852 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:31:34 compute-0 nova_compute[187787]: 2025-12-08 20:31:34.852 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:31:34 compute-0 nova_compute[187787]: 2025-12-08 20:31:34.852 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.073 187791 WARNING nova.virt.libvirt.driver [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.075 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5619MB free_disk=72.87643432617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.076 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.077 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.600 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.600 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7680MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.629 187791 DEBUG nova.compute.provider_tree [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed in ProviderTree for provider: b3899b98-89be-4b90-bd85-9c57a93a16c4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.649 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.655 187791 DEBUG nova.scheduler.client.report [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Inventory has not changed for provider b3899b98-89be-4b90-bd85-9c57a93a16c4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7680, 'reserved': 512, 'min_unit': 1, 'max_unit': 7680, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.657 187791 DEBUG nova.compute.resource_tracker [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 08 20:31:35 compute-0 nova_compute[187787]: 2025-12-08 20:31:35.657 187791 DEBUG oslo_concurrency.lockutils [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 08 20:31:36 compute-0 podman[225194]: 2025-12-08 20:31:36.545915173 +0000 UTC m=+0.069479334 container health_status 2ba95417a9ccb9fff95a31018ff1caecc34c0adddf74923419e3f41faa343398 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 08 20:31:36 compute-0 podman[225193]: 2025-12-08 20:31:36.575365634 +0000 UTC m=+0.098259094 container health_status 0223112eab84324c03f1bbbdf29c43d5bf57a2e66984ce0058ad8852076e85d8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 08 20:31:36 compute-0 nova_compute[187787]: 2025-12-08 20:31:36.642 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:31:36 compute-0 nova_compute[187787]: 2025-12-08 20:31:36.779 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:31:37 compute-0 nova_compute[187787]: 2025-12-08 20:31:37.079 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:38 compute-0 ovs-vsctl[225265]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 08 20:31:38 compute-0 nova_compute[187787]: 2025-12-08 20:31:38.780 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:31:38 compute-0 nova_compute[187787]: 2025-12-08 20:31:38.781 187791 DEBUG oslo_service.periodic_task [None req-2ba30738-de97-4d3c-82f4-a99b9a6ac2b5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 08 20:31:39 compute-0 virtqemud[187722]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 08 20:31:39 compute-0 virtqemud[187722]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 08 20:31:39 compute-0 virtqemud[187722]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 08 20:31:40 compute-0 crontab[225684]: (root) LIST (root)
Dec 08 20:31:40 compute-0 nova_compute[187787]: 2025-12-08 20:31:40.653 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:42 compute-0 nova_compute[187787]: 2025-12-08 20:31:42.081 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 08 20:31:42 compute-0 systemd[1]: Starting Hostname Service...
Dec 08 20:31:42 compute-0 systemd[1]: Started Hostname Service.
Dec 08 20:31:44 compute-0 podman[225949]: 2025-12-08 20:31:44.775971201 +0000 UTC m=+0.090710508 container health_status c6039b79b01c42dbd1a3cf42b76e9e3e83725aa3b5a4386d40a188d72153e03c (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true)
Dec 08 20:31:44 compute-0 podman[225945]: 2025-12-08 20:31:44.787666817 +0000 UTC m=+0.106447141 container health_status 5d7ee6a8f4d38866c07cb8ba255994992221882654fe054ec8d40186dc132997 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 08 20:31:44 compute-0 podman[225948]: 2025-12-08 20:31:44.795221013 +0000 UTC m=+0.114174033 container health_status b47d3cc48f454cb5ec4a834f4767de821651fb6e2e482de5011322c6f38581c1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 08 20:31:45 compute-0 nova_compute[187787]: 2025-12-08 20:31:45.657 187791 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 30 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
